Dec 10 11:51:52 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 10 11:51:52 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:52 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 
11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 10 11:51:53 crc 
restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 
11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 10 11:51:53 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 10 11:51:54 crc kubenswrapper[4852]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 10 11:51:54 crc kubenswrapper[4852]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 10 11:51:54 crc kubenswrapper[4852]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 10 11:51:54 crc kubenswrapper[4852]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 10 11:51:54 crc kubenswrapper[4852]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 10 11:51:54 crc kubenswrapper[4852]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.028745 4852 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031618 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031635 4852 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031639 4852 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031645 4852 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031649 4852 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031654 4852 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031658 4852 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031662 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031667 4852 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031672 4852 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031677 4852 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031695 4852 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031700 4852 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031704 4852 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031713 4852 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031716 4852 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031720 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031723 4852 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031727 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031731 4852 feature_gate.go:330] unrecognized feature gate: Example
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031735 4852 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031739 4852 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031744 4852 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031749 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031754 4852 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031759 4852 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031770 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031775 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031780 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031785 4852 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031791 4852 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031796 4852 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031800 4852 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031805 4852 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031810 4852 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031815 4852 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031819 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031823 4852 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031828 4852 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031832 4852 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031837 4852 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031841 4852 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031848 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031853 4852 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031857 4852 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031861 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031865 4852 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031869 4852 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031874 4852 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031878 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031885 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031891 4852 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031895 4852 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031900 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031904 4852 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031909 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031915 4852 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031919 4852 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031922 4852 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031926 4852 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031930 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031935 4852 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031939 4852 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031942 4852 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031945 4852 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031949 4852 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031952 4852 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031956 4852 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031959 4852 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031963 4852 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.031967 4852 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032180 4852 flags.go:64] FLAG: --address="0.0.0.0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032190 4852 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032199 4852 flags.go:64] FLAG: --anonymous-auth="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032205 4852 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032211 4852 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032215 4852 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032221 4852 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032226 4852 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032245 4852 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032249 4852 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032254 4852 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032258 4852 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032263 4852 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032268 4852 flags.go:64] FLAG: --cgroup-root=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032272 4852 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032277 4852 flags.go:64] FLAG: --client-ca-file=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032281 4852 flags.go:64] FLAG: --cloud-config=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032284 4852 flags.go:64] FLAG: --cloud-provider=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032288 4852 flags.go:64] FLAG: --cluster-dns="[]"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032294 4852 flags.go:64] FLAG: --cluster-domain=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032298 4852 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032302 4852 flags.go:64] FLAG: --config-dir=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032306 4852 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032310 4852 flags.go:64] FLAG: --container-log-max-files="5"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032316 4852 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032321 4852 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032325 4852 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032329 4852 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032333 4852 flags.go:64] FLAG: --contention-profiling="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032337 4852 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032342 4852 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032346 4852 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032350 4852 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032356 4852 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032360 4852 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032364 4852 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032369 4852 flags.go:64] FLAG: --enable-load-reader="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032373 4852 flags.go:64] FLAG: --enable-server="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032377 4852 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032381 4852 flags.go:64] FLAG: --event-burst="100"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032386 4852 flags.go:64] FLAG: --event-qps="50"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032390 4852 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032394 4852 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032398 4852 flags.go:64] FLAG: --eviction-hard=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032403 4852 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032407 4852 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032411 4852 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032416 4852 flags.go:64] FLAG: --eviction-soft=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032420 4852 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032424 4852 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032428 4852 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032433 4852 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032437 4852 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032441 4852 flags.go:64] FLAG: --fail-swap-on="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032445 4852 flags.go:64] FLAG: --feature-gates=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032450 4852 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032455 4852 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032459 4852 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032463 4852 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032467 4852 flags.go:64] FLAG: --healthz-port="10248"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032471 4852 flags.go:64] FLAG: --help="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032476 4852 flags.go:64] FLAG: --hostname-override=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032480 4852 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032484 4852 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032488 4852 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032492 4852 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032496 4852 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032501 4852 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032505 4852 flags.go:64] FLAG: --image-service-endpoint=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032510 4852 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032513 4852 flags.go:64] FLAG: --kube-api-burst="100"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032518 4852 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032522 4852 flags.go:64] FLAG: --kube-api-qps="50"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032526 4852 flags.go:64] FLAG: --kube-reserved=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032531 4852 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032535 4852 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032539 4852 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032543 4852 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032547 4852 flags.go:64] FLAG: --lock-file=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032551 4852 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032555 4852 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032559 4852 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032568 4852 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032572 4852 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032576 4852 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032580 4852 flags.go:64] FLAG: --logging-format="text"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032584 4852 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032589 4852 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032593 4852 flags.go:64] FLAG: --manifest-url=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032597 4852 flags.go:64] FLAG: --manifest-url-header=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032603 4852 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032607 4852 flags.go:64] FLAG: --max-open-files="1000000"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032613 4852 flags.go:64] FLAG: --max-pods="110"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032617 4852 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032621 4852 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032625 4852 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032629 4852 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032633 4852 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032637 4852 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032643 4852 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032656 4852 flags.go:64] FLAG: --node-status-max-images="50"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032661 4852 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032665 4852 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032670 4852 flags.go:64] FLAG: --pod-cidr=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032674 4852 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032680 4852 flags.go:64] FLAG: --pod-manifest-path=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032684 4852 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032689 4852 flags.go:64] FLAG: --pods-per-core="0"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032693 4852 flags.go:64] FLAG: --port="10250"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032698 4852 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032701 4852 flags.go:64] FLAG: --provider-id=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032706 4852 flags.go:64] FLAG: --qos-reserved=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032709 4852 flags.go:64] FLAG: --read-only-port="10255"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032714 4852 flags.go:64] FLAG: --register-node="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032719 4852 flags.go:64] FLAG: --register-schedulable="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032723 4852 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032730 4852 flags.go:64] FLAG: --registry-burst="10"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032734 4852 flags.go:64] FLAG: --registry-qps="5"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032738 4852 flags.go:64] FLAG: --reserved-cpus=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032742 4852 flags.go:64] FLAG: --reserved-memory=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032747 4852 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032751 4852 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032755 4852 flags.go:64] FLAG: --rotate-certificates="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032760 4852 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032764 4852 flags.go:64] FLAG: --runonce="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032768 4852 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032772 4852 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032776 4852 flags.go:64] FLAG: --seccomp-default="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032781 4852 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032785 4852 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032790 4852 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032795 4852 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032800 4852 flags.go:64] FLAG: --storage-driver-password="root"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032804 4852 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032808 4852 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032812 4852 flags.go:64] FLAG: --storage-driver-user="root"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032816 4852 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032820 4852 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032824 4852 flags.go:64] FLAG: --system-cgroups=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032828 4852 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032834 4852 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032838 4852 flags.go:64] FLAG: --tls-cert-file=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032842 4852 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032847 4852 flags.go:64] FLAG: --tls-min-version=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032852 4852 flags.go:64] FLAG: --tls-private-key-file=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032855 4852 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032861 4852 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032865 4852 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032869 4852 flags.go:64] FLAG: --v="2"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032874 4852 flags.go:64] FLAG: --version="false"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032879 4852 flags.go:64] FLAG: --vmodule=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032884 4852 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.032888 4852 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.032980 4852 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.032985 4852 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.032989 4852 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.032993 4852 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.032996 4852 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033000 4852 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033004 4852 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033008 4852 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033011 4852 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033014 4852 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033018 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033021 4852 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033025 4852 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033028 4852 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033032 4852 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033035 4852 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033039 4852 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033042 4852 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033045 4852 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033049 4852 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033053 4852 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033056 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033060 4852 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033063 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033067 4852 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033072 4852 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033075 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033079 4852 feature_gate.go:330] unrecognized feature gate: Example
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033083 4852 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033087 4852 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033091 4852 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033095 4852 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033098 4852 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033102 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033105 4852 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033109 4852 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033112 4852 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033116 4852 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033119 4852 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033123 4852 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033126 4852 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033130 4852 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033133 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033138 4852 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033141 4852 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033146 4852 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033150 4852 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033154 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033158 4852 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033162 4852 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033166 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033170 4852 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033174 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033179 4852 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033182 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033186 4852 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033189 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033194 4852 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033198 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033202 4852 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033205 4852 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033210 4852 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033214 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033218 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033222 4852 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033226 4852 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033245 4852 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033249 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033253 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033256 4852 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.033260 4852 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.033418 4852 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.043091 4852 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.043119 4852 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043192 4852 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043200 4852 feature_gate.go:330] unrecognized feature gate: Example
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043205 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043209 4852 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043213 4852 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043217 4852 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043222 4852 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043226 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043242 4852 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043247 4852 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043251 4852 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043255 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043258 4852 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043262 4852 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043266 4852 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043269 4852 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043273 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043276 4852 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043280 4852 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043283 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043287 4852 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043291 4852 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043294 4852 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043298 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043302 4852 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043308 4852 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043319 4852 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043323 4852 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043327 4852 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043330 4852 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043334 4852 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043337 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043341 4852 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043344 4852 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043348 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043351 4852 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043355 4852 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043358 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043362 4852 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043365 4852 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043369 4852 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043372 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043376 4852 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043382 4852 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043386 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043390 4852 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043395 4852 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
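[Editor's aside: the repeated "unrecognized feature gate" warnings are expected on OpenShift. The rendered kubelet configuration carries the cluster-wide gate set, which includes OpenShift-level gates (GatewayAPI, PinnedImages, and so on) that this kubelet binary was not compiled with; the kubelet logs each unknown name at feature_gate.go:330 and ignores it, and only the recognized subset appears in the effective map printed at feature_gate.go:386. A hedged sketch of how such gates would be expressed in the config file's featureGates stanza — illustrative only, since the actual rendered kubelet.conf is not shown in this log:]

    # Hypothetical featureGates stanza that would yield the effective map above.
    # Gate names and values are copied from the feature_gate.go:386 line; any gate
    # unknown to this kubelet build would simply be warned about and skipped.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      CloudDualStackNodeIPs: true
      DisableKubeletCloudCredentialProviders: true
      KMSv1: true
      ValidatingAdmissionPolicy: true
      DynamicResourceAllocation: false
      EventedPLEG: false
      NodeSwap: false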
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043400 4852 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043404 4852 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043407 4852 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043412 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043416 4852 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043420 4852 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043424 4852 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043428 4852 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043432 4852 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043435 4852 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043439 4852 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043442 4852 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043446 4852 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043449 4852 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043453 4852 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043461 4852 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043464 4852 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043468 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043471 4852 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043475 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043478 4852 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043482 4852 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043485 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043489 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.043495 4852 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043638 4852 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043645 4852 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043651 4852 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043656 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043660 4852 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043665 4852 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043668 4852 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043672 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043676 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043680 4852 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043684 4852 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043688 4852 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043691 4852 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043695 4852 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043699 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043702 4852 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043706 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043710 4852 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043715 4852 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043719 4852 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043722 4852 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043725 4852 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043729 4852 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043732 4852 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043736 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043740 4852 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043754 4852 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043758 4852 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043764 4852 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043768 4852 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043772 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043776 4852 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043779 4852 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043783 4852 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043786 4852 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043790 4852 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043794 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043797 4852 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043801 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043804 4852 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043808 4852 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043811 4852 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043814 4852 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043818 4852 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043821 4852 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043825 4852 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043828 4852 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043831 4852 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043835 4852 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043838 4852 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043842 4852 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043845 4852 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043849 4852 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043852 4852 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043855 4852 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043859 4852 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043862 4852 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043866 4852 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043869 4852 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043872 4852 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043876 4852 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043880 4852 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043890 4852 feature_gate.go:330] unrecognized feature gate: Example
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043895 4852 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043899 4852 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043903 4852 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043907 4852 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043911 4852 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043915 4852 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043918 4852 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.043922 4852 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.043928 4852 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.045132 4852 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.048854 4852 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.048957 4852 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
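[Editor's note: the flood of "unrecognized feature gate" warnings above is warn-and-continue behavior: the gate spec handed to this kubelet includes names (mostly OpenShift-level gates) that the binary's compiled-in registry does not know, so they are logged at W level and skipped, and only recognized gates reach the effective map printed at feature_gate.go:386. The following is a minimal illustrative sketch of that parse-and-warn pattern in Go, assuming a simple "Name=bool,..." spec format; it is not kubelet's actual feature_gate.go, and knownGates is a hypothetical stand-in for the real registry.]

// Illustrative sketch only: parse a "Name=bool" gate spec, warn on
// unknown names, and keep the rest, mirroring the warn-and-continue
// behavior seen in the log above.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// knownGates stands in for the binary's compiled-in gate registry.
var knownGates = map[string]bool{
	"ValidatingAdmissionPolicy":              true,
	"DisableKubeletCloudCredentialProviders": true,
}

func parseGates(spec string) map[string]bool {
	effective := map[string]bool{}
	for _, pair := range strings.Split(spec, ",") {
		name, val, ok := strings.Cut(pair, "=")
		if !ok {
			continue // malformed entry; real code would reject it
		}
		if !knownGates[name] {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue // unknown gates are ignored, not fatal
		}
		enabled, err := strconv.ParseBool(val)
		if err != nil {
			continue
		}
		effective[name] = enabled
	}
	return effective
}

func main() {
	// GatewayAPI is warned about and dropped; the known gate is kept.
	fmt.Println(parseGates("GatewayAPI=true,ValidatingAdmissionPolicy=true"))
}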
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.049489 4852 server.go:997] "Starting client certificate rotation"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.049523 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.049937 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-06 21:56:26.854202634 +0000 UTC
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.050068 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.054614 4852 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.056659 4852 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.056782 4852 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.062927 4852 log.go:25] "Validated CRI v1 runtime API"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.075002 4852 log.go:25] "Validated CRI v1 image API"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.077103 4852 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.079290 4852 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-10-11-47-37-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.079324 4852 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.096660 4852 manager.go:217] Machine: {Timestamp:2025-12-10 11:51:54.095506601 +0000 UTC m=+0.181031845 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8aceede4-7323-43e8-a979-088fd86df0ad BootID:ec81da2d-010c-440a-a2c9-f3547047ac06 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2a:ca:1c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2a:ca:1c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ad:0c:2d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:41:14:91 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:15:c1:7a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:c6:2f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:36:8c:89:52:8b:63 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b2:c4:36:75:ba:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.096882 4852 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.097097 4852 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.097890 4852 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.098246 4852 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.098304 4852 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.098595 4852 topology_manager.go:138] "Creating topology manager with none policy"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.098608 4852 container_manager_linux.go:303] "Creating device plugin manager"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.098857 4852 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.098905 4852 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.099141 4852 state_mem.go:36] "Initialized new in-memory state store"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.099263 4852 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.100253 4852 kubelet.go:418] "Attempting to sync node with API server"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.100278 4852 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.100305 4852 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.100321 4852 kubelet.go:324] "Adding apiserver pod source"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.100335 4852 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.102526 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.102564 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.102669 4852 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.102669 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.102697 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.103120 4852 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
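[Editor's note: every "dial tcp 38.102.83.73:6443: connect: connection refused" above and below is the same underlying condition: the kubelet's clients (informers, the CSR requester, the event and lease writers) are reaching for api-int.crc.testing:6443 before the static-pod kube-apiserver is serving, and each failure is retried with backoff until the endpoint comes up. Below is a self-contained Go sketch of that probe-and-retry loop under an assumed simple exponential policy; client-go's actual reflector and controller backoff differs in detail, and the 200ms seed is borrowed from the lease controller's interval="200ms" shown further on.]

// Illustrative sketch only (not client-go): probe an apiserver address
// and back off exponentially until it accepts TCP connections.
package main

import (
	"fmt"
	"net"
	"time"
)

func waitForAPIServer(addr string, maxBackoff time.Duration) {
	backoff := 200 * time.Millisecond // assumed seed, cf. interval="200ms"
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("apiserver reachable:", addr)
			return
		}
		// Same shape as the log lines: report the dial error, then retry.
		fmt.Printf("dial %s: %v; retrying in %s\n", addr, err, backoff)
		time.Sleep(backoff)
		if backoff *= 2; backoff > maxBackoff {
			backoff = maxBackoff // cap the retry interval
		}
	}
}

func main() {
	waitForAPIServer("api-int.crc.testing:6443", 30*time.Second)
}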
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104175 4852 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104850 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104891 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104899 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104908 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104922 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104930 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104937 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104951 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104966 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104974 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104985 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.104994 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.105184 4852 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.106288 4852 server.go:1280] "Started kubelet"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.106537 4852 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.106682 4852 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.106736 4852 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.107843 4852 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.108551 4852 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.73:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187fd8669850d4b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-10 11:51:54.106221752 +0000 UTC m=+0.191746976,LastTimestamp:2025-12-10 11:51:54.106221752 +0000 UTC m=+0.191746976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109458 4852 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109488 4852 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109614 4852 server.go:460] "Adding debug handlers to kubelet server"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.109796 4852 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109819 4852 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109852 4852 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109977 4852 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.110322 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="200ms"
Dec 10 11:51:54 crc systemd[1]: Started Kubernetes Kubelet.
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.109534 4852 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 08:54:01.350356515 +0000 UTC
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.118331 4852 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 141h2m7.232032892s for next certificate rotation
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.118090 4852 factory.go:55] Registering systemd factory
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.118365 4852 factory.go:221] Registration of the systemd container factory successfully
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.118184 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.118444 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.118755 4852 factory.go:153] Registering CRI-O factory
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.118777 4852 factory.go:221] Registration of the crio container factory successfully
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.118891 4852 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file
or directory Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.119528 4852 factory.go:103] Registering Raw factory Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.119625 4852 manager.go:1196] Started watching for new ooms in manager Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.120369 4852 manager.go:319] Starting recovery of all containers Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127807 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127860 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127875 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127891 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127904 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127917 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127930 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127946 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.127990 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128002 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 10 
11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128012 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128024 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128034 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128047 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128058 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128069 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128080 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128090 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128101 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128128 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128138 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 
11:51:54.128169 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128179 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128189 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128199 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128211 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128223 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128256 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128294 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128306 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128319 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128338 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128352 4852 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128365 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128399 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128414 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128427 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128439 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128453 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128467 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128480 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128493 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128505 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128518 4852 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128535 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128546 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128555 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128566 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128577 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128587 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128598 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128610 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128627 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128640 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128667 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128681 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128694 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128707 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128720 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128733 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128745 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128758 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128769 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128782 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128800 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128820 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128838 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128853 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128869 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128887 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128902 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128916 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128930 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.128946 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.131407 4852 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.131929 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 
11:51:54.131960 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.131983 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132001 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132022 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132037 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132055 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132072 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132089 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132105 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132120 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132137 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132158 
4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132179 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132194 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132216 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132287 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132303 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132317 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132331 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132348 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132362 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132376 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132394 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132408 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132423 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132436 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132452 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132466 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132480 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132505 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132538 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132554 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132570 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132584 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132599 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132612 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132627 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132641 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132652 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132664 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132675 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132687 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132698 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132709 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132719 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132731 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132741 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132752 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132765 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132776 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132789 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132800 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132812 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132823 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132834 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132843 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132854 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132864 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132874 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132885 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132895 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132905 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132915 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132962 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132973 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.132984 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133009 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133021 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133033 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133045 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133088 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133105 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133115 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133134 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133145 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133156 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133169 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133181 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133193 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133205 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133215 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133230 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133259 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133271 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133284 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133295 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133305 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133317 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133329 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133339 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133357 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133368 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133378 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133390 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133402 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133413 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133425 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133437 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133449 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133461 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133472 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133484 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133495 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133506 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133518 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133530 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133540 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133551 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133562 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133573 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133584 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133594 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133607 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133619 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133629 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133642 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133652 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133663 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133673 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133683 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133694 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133704 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133715 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133726 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133736 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133748 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133758 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133768 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133782 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133794 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133804 4852 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133813 4852 reconstruct.go:97] "Volume reconstruction finished" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.133821 4852 reconciler.go:26] "Reconciler: start to sync state" Dec 10 
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.137822 4852 manager.go:324] Recovery completed
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.146460 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.147807 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.147844 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.147855 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.148975 4852 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.148995 4852 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.149014 4852 state_mem.go:36] "Initialized new in-memory state store"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.162783 4852 policy_none.go:49] "None policy: Start"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.165275 4852 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.165314 4852 state_mem.go:35] "Initializing new in-memory state store"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.166366 4852 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.168470 4852 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.168509 4852 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.168540 4852 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.168590 4852 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.169938 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.170006 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.210103 4852 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.229628 4852 manager.go:334] "Starting Device Plugin manager"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.229685 4852 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.229696 4852 server.go:79] "Starting device plugin registration server"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.230115 4852 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.230135 4852 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.230496 4852 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.230585 4852 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.230601 4852 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.239186 4852 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.268662 4852 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.268782 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.270139 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.270170 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.270180 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.270287 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.270676 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.270745 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.271178 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.271217 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.271242 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.271413 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.271514 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.271548 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272455 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272482 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272494 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272501 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272544 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272556 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272495 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272597 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272616 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272762 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272836 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.272866 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273757 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273776 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273791 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273799 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273801 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273813 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.273979 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274087 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274124 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274781 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274811 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274830 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274970 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.274991 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.275588 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.275607 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.275618 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.276152 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.276182 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.276194 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.311189 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="400ms"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.330667 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.332058 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.332094 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.332106 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.332133 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.333381 4852 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.73:6443: connect: connection refused" node="crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336414 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336446 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336465 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336503 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336519 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336535 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336551 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336566 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336585 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336607 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336675 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336717 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336767 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336801 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.336828 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437543 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437607 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437633 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437871 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437917 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437908 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437961 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437937 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438019 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437767 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438045 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437800 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438072 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438094 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438111 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438116 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438134 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438162 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438161 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438183 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438162 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438214 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438249 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438218 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438269 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438288 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.437960 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438320 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438372 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.438320 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.534598 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.536603 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.536672 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.536681 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.536707 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.537327 4852 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.73:6443: connect: connection refused" node="crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.604311 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.615715 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.621405 4852 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.625994 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a900bf766dd88e87e551a75cda039de73edeeae1bf1af13d90cd4645d2b3bc62 WatchSource:0}: Error finding container a900bf766dd88e87e551a75cda039de73edeeae1bf1af13d90cd4645d2b3bc62: Status 404 returned error can't find the container with id a900bf766dd88e87e551a75cda039de73edeeae1bf1af13d90cd4645d2b3bc62 Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.631623 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6d1b7288106080190e86d7f2439e82718c7e6342a9462a912617139ed8a9e1eb WatchSource:0}: Error finding container 6d1b7288106080190e86d7f2439e82718c7e6342a9462a912617139ed8a9e1eb: Status 404 returned error can't find the container with id 6d1b7288106080190e86d7f2439e82718c7e6342a9462a912617139ed8a9e1eb Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.636635 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9e7f71ff436a7a2d700fdf5e64af5ddcb5ffbaeca9fec7e35a9b6835cdacbe66 WatchSource:0}: Error finding container 9e7f71ff436a7a2d700fdf5e64af5ddcb5ffbaeca9fec7e35a9b6835cdacbe66: Status 404 returned error can't find the container with id 9e7f71ff436a7a2d700fdf5e64af5ddcb5ffbaeca9fec7e35a9b6835cdacbe66 Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.650709 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.659196 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:51:54 crc kubenswrapper[4852]: W1210 11:51:54.675005 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a57b4fba5f287b250bd54bd8ab1a3070d8bbd7e079df13537131a2ee5fe95408 WatchSource:0}: Error finding container a57b4fba5f287b250bd54bd8ab1a3070d8bbd7e079df13537131a2ee5fe95408: Status 404 returned error can't find the container with id a57b4fba5f287b250bd54bd8ab1a3070d8bbd7e079df13537131a2ee5fe95408 Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.712992 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="800ms" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.937796 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.940044 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.940094 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.940107 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:54 crc kubenswrapper[4852]: I1210 11:51:54.940133 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 10 11:51:54 crc kubenswrapper[4852]: E1210 11:51:54.941089 4852 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.73:6443: connect: connection refused" node="crc" Dec 10 11:51:55 crc kubenswrapper[4852]: W1210 11:51:55.083370 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:55 crc kubenswrapper[4852]: E1210 11:51:55.083444 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.107424 4852 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.173563 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6d1b7288106080190e86d7f2439e82718c7e6342a9462a912617139ed8a9e1eb"} Dec 10 11:51:55 crc kubenswrapper[4852]: W1210 11:51:55.174190 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:55 crc kubenswrapper[4852]: E1210 11:51:55.174303 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.175481 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1bc9fa99429ec1ea462d4f5c12794d4b3b27fa3b3feb7eb45828649083bc007c"} Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.175558 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a900bf766dd88e87e551a75cda039de73edeeae1bf1af13d90cd4645d2b3bc62"} Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.176448 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a57b4fba5f287b250bd54bd8ab1a3070d8bbd7e079df13537131a2ee5fe95408"} Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.177561 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1"} Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.177616 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"398410c6669fb4dc5b0bf49166c3524f7772e4fcf4fc8b3bac8c78236c945203"} Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.178580 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9e7f71ff436a7a2d700fdf5e64af5ddcb5ffbaeca9fec7e35a9b6835cdacbe66"} Dec 10 11:51:55 crc kubenswrapper[4852]: W1210 11:51:55.344703 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:55 crc kubenswrapper[4852]: E1210 11:51:55.344816 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:55 crc kubenswrapper[4852]: E1210 11:51:55.513691 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" 
interval="1.6s" Dec 10 11:51:55 crc kubenswrapper[4852]: W1210 11:51:55.581210 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:55 crc kubenswrapper[4852]: E1210 11:51:55.581311 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.741698 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.743480 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.743536 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.743546 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:55 crc kubenswrapper[4852]: I1210 11:51:55.743572 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 10 11:51:55 crc kubenswrapper[4852]: E1210 11:51:55.744211 4852 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.73:6443: connect: connection refused" node="crc" Dec 10 11:51:56 crc kubenswrapper[4852]: I1210 11:51:56.107735 4852 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:56 crc kubenswrapper[4852]: I1210 11:51:56.110846 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 10 11:51:56 crc kubenswrapper[4852]: E1210 11:51:56.111630 4852 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:56 crc kubenswrapper[4852]: W1210 11:51:56.739443 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:56 crc kubenswrapper[4852]: E1210 11:51:56.739543 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:57 crc kubenswrapper[4852]: I1210 11:51:57.108040 4852 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:57 crc kubenswrapper[4852]: E1210 11:51:57.114958 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="3.2s" Dec 10 11:51:57 crc kubenswrapper[4852]: I1210 11:51:57.344767 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:57 crc kubenswrapper[4852]: I1210 11:51:57.346006 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:57 crc kubenswrapper[4852]: I1210 11:51:57.346057 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:57 crc kubenswrapper[4852]: I1210 11:51:57.346069 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:57 crc kubenswrapper[4852]: I1210 11:51:57.346099 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 10 11:51:57 crc kubenswrapper[4852]: E1210 11:51:57.346607 4852 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.73:6443: connect: connection refused" node="crc" Dec 10 11:51:57 crc kubenswrapper[4852]: W1210 11:51:57.462389 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:57 crc kubenswrapper[4852]: E1210 11:51:57.462463 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:57 crc kubenswrapper[4852]: W1210 11:51:57.670884 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:57 crc kubenswrapper[4852]: E1210 11:51:57.670931 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:57 crc kubenswrapper[4852]: W1210 11:51:57.755109 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:57 crc kubenswrapper[4852]: E1210 11:51:57.755187 4852 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.73:6443: connect: connection refused" logger="UnhandledError" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.105607 4852 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1bc9fa99429ec1ea462d4f5c12794d4b3b27fa3b3feb7eb45828649083bc007c" exitCode=0 Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.105773 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1bc9fa99429ec1ea462d4f5c12794d4b3b27fa3b3feb7eb45828649083bc007c"} Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.105829 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.106965 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.106997 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.107010 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.112027 4852 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.114015 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c" exitCode=0 Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.114114 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c"} Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.114223 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.115620 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.115667 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.115679 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.116885 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e"} Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.118038 4852 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.118454 4852 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a05c51242468b3348094bff2f00f913953cd8551ddb1f58871ee4253e6746015" exitCode=0 Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.118529 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a05c51242468b3348094bff2f00f913953cd8551ddb1f58871ee4253e6746015"} Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.118606 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119448 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119453 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119508 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119593 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119598 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119609 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119924 4852 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a7efdfa6cf36e4e4dbfde699a67562d14ac49c3c526ff636c9c0b6d11b3e1c63" exitCode=0 Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.119960 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a7efdfa6cf36e4e4dbfde699a67562d14ac49c3c526ff636c9c0b6d11b3e1c63"} Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.120157 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.121531 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.121556 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:58 crc kubenswrapper[4852]: I1210 11:51:58.121565 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.108382 4852 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.73:6443: connect: connection refused Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.132693 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:59 crc 
kubenswrapper[4852]: I1210 11:51:59.133332 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6e42d0d86ab254f316e18daf51e8e30f9ce0cc03ecda16538c199f47ea961f88"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.134439 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.134468 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.134479 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.137651 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3b5cc2247c1194b57d9a4a27cdd3d174e936ce04a79f8b890ef46dd51c03677d"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.137685 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"84fda3caa60f7cf413021a5ee145d75df479e96cd4c2edf0328ec6252c53d46c"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.137696 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e8b74c9172eb3a56c30876e5522ce1430f21dba641c0bda1f853b4633cb4728"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.137785 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.138900 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.138927 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.138938 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.141381 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.141414 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.141428 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.166607 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.166656 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.166743 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.167459 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.167486 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.167497 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.169040 4852 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8fa9f3126f3c5830e24df4ee034d01809337039b41afa9c68e62640680ba8741" exitCode=0 Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.169069 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8fa9f3126f3c5830e24df4ee034d01809337039b41afa9c68e62640680ba8741"} Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.600582 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:51:59 crc kubenswrapper[4852]: I1210 11:51:59.606032 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.173980 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.174003 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86"} Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.174037 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.174044 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77"} Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.173983 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.174467 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.174731 4852 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.173983 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175089 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175112 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175090 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175123 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175121 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175141 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175151 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175139 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175190 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175618 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175636 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.175644 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.176106 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.176124 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.176132 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.239544 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.390293 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.407585 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.546963 4852 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.548434 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.548471 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.548484 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:00 crc kubenswrapper[4852]: I1210 11:52:00.548507 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179129 4852 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="30a236767407de8a7d0a9c4a03bd6fc3e556fcfc4077816c992c51f2f7771b1a" exitCode=0 Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179190 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"30a236767407de8a7d0a9c4a03bd6fc3e556fcfc4077816c992c51f2f7771b1a"} Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179767 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179900 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179968 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179993 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.179911 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.180561 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.180589 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.180602 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181006 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181104 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181184 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181342 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181363 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181374 4852 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181429 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181441 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.181451 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:01 crc kubenswrapper[4852]: I1210 11:52:01.344178 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.186565 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eda420e05497cf2adf27c49891984efa2f922f908eb20dbe7479735ac06a4816"} Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.186629 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.186719 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.186631 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9de04dbb70fc3883a0827e680e7e99815d060dc60fe46ea9612952038de7ec22"} Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.186788 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d883b429b5042a0abe72b9ffd96bb64cfbdbbc0b7c773fa16274b9d965ed28da"} Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.187679 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.187730 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.187742 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.188784 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.188816 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:02 crc kubenswrapper[4852]: I1210 11:52:02.188824 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.133467 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.159816 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.194427 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98658c73eb25174f825d06c73cf3714fed30a10fc4c7a83f201a5343ade5040f"} Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.194491 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31dbdaa8d59a029deb7dfcbd9a2a13960ca0a8f3ea23bbb793522b6e78ea3e7b"} Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.194491 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.194667 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.194736 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.195796 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.195826 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.195840 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.195891 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.195920 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.195933 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.196325 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.196582 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.196627 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:03 crc kubenswrapper[4852]: I1210 11:52:03.196644 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.198313 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.198403 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.201337 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.201345 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.201397 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 
11:52:04.201417 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.201430 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.201441 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:04 crc kubenswrapper[4852]: E1210 11:52:04.239323 4852 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 10 11:52:04 crc kubenswrapper[4852]: I1210 11:52:04.944371 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 10 11:52:05 crc kubenswrapper[4852]: I1210 11:52:05.202342 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:05 crc kubenswrapper[4852]: I1210 11:52:05.203443 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:05 crc kubenswrapper[4852]: I1210 11:52:05.203503 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:05 crc kubenswrapper[4852]: I1210 11:52:05.203516 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:06 crc kubenswrapper[4852]: I1210 11:52:06.160796 4852 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 10 11:52:06 crc kubenswrapper[4852]: I1210 11:52:06.160876 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.121096 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.121274 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.126375 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.126766 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.126793 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.128128 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.207746 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.209312 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.209354 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:07 crc kubenswrapper[4852]: I1210 11:52:07.209366 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:09 crc kubenswrapper[4852]: W1210 11:52:09.991600 4852 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 10 11:52:09 crc kubenswrapper[4852]: I1210 11:52:09.991687 4852 trace.go:236] Trace[1366704413]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Dec-2025 11:51:59.990) (total time: 10001ms): Dec 10 11:52:09 crc kubenswrapper[4852]: Trace[1366704413]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:52:09.991) Dec 10 11:52:09 crc kubenswrapper[4852]: Trace[1366704413]: [10.001641638s] [10.001641638s] END Dec 10 11:52:09 crc kubenswrapper[4852]: E1210 11:52:09.991707 4852 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 10 11:52:10 crc kubenswrapper[4852]: I1210 11:52:10.108060 4852 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 10 11:52:10 crc kubenswrapper[4852]: E1210 11:52:10.316998 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 10 11:52:10 crc kubenswrapper[4852]: E1210 11:52:10.391749 4852 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 10 11:52:10 crc kubenswrapper[4852]: I1210 11:52:10.407654 4852 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 10 11:52:10 crc kubenswrapper[4852]: I1210 11:52:10.407788 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 10 11:52:10 crc kubenswrapper[4852]: E1210 11:52:10.549931 4852 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 10 11:52:10 crc kubenswrapper[4852]: I1210 11:52:10.593350 4852 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 10 11:52:10 crc kubenswrapper[4852]: I1210 11:52:10.593459 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 10 11:52:14 crc kubenswrapper[4852]: E1210 11:52:14.239495 4852 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 10 11:52:14 crc kubenswrapper[4852]: I1210 11:52:14.969486 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 10 11:52:14 crc kubenswrapper[4852]: I1210 11:52:14.969660 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:14 crc kubenswrapper[4852]: I1210 11:52:14.970831 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:14 crc kubenswrapper[4852]: I1210 11:52:14.970869 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:14 crc kubenswrapper[4852]: I1210 11:52:14.970881 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:14 crc kubenswrapper[4852]: I1210 11:52:14.984790 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.232204 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.233347 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.233411 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.233424 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.413771 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.414028 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.415483 4852 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.415548 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.415563 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.419095 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.597917 4852 trace.go:236] Trace[2133347029]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Dec-2025 11:52:01.773) (total time: 13824ms): Dec 10 11:52:15 crc kubenswrapper[4852]: Trace[2133347029]: ---"Objects listed" error: 13824ms (11:52:15.597) Dec 10 11:52:15 crc kubenswrapper[4852]: Trace[2133347029]: [13.82433751s] [13.82433751s] END Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.597984 4852 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.598083 4852 trace.go:236] Trace[815304884]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Dec-2025 11:52:03.658) (total time: 11939ms): Dec 10 11:52:15 crc kubenswrapper[4852]: Trace[815304884]: ---"Objects listed" error: 11939ms (11:52:15.598) Dec 10 11:52:15 crc kubenswrapper[4852]: Trace[815304884]: [11.939119971s] [11.939119971s] END Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.598102 4852 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.598299 4852 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.598454 4852 trace.go:236] Trace[1835145521]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Dec-2025 11:52:02.269) (total time: 13329ms): Dec 10 11:52:15 crc kubenswrapper[4852]: Trace[1835145521]: ---"Objects listed" error: 13328ms (11:52:15.598) Dec 10 11:52:15 crc kubenswrapper[4852]: Trace[1835145521]: [13.329013478s] [13.329013478s] END Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.598478 4852 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.630567 4852 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45732->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.630718 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45732->192.168.126.11:17697: read: connection reset by peer" Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.631243 4852 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure 
output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 10 11:52:15 crc kubenswrapper[4852]: I1210 11:52:15.631272 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.112088 4852 apiserver.go:52] "Watching apiserver" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.122400 4852 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.122733 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.123183 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.123283 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.123120 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.123567 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.123633 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.123685 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.123928 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.124170 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.124252 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126222 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126465 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126470 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126634 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126718 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126793 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.126837 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.127351 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.131253 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.160167 4852 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.160258 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.210926 4852 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.236579 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 10 
11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.238443 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86" exitCode=255 Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.238483 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86"} Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549035 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549088 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549088 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549145 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549172 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549198 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549220 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549268 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549289 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549308 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549349 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549371 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549390 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549413 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549435 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549457 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549477 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549500 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549525 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549548 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549580 4852 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549606 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549627 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549646 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549671 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549693 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549716 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549738 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549923 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.549987 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") 
" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550022 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550051 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550084 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550119 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550352 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550431 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550469 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550461 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550493 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550498 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550519 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550545 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550565 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550589 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550611 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550628 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550999 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551171 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.550642 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551389 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551413 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551457 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551489 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551519 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551551 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551581 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551606 4852 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551631 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551657 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551687 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551711 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551738 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551763 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551789 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551817 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551845 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551875 4852 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551903 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551933 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551960 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551989 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552021 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552053 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552078 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552106 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552133 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 10 
11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552169 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552200 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552225 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552272 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552299 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552330 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552356 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552387 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552419 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552452 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552478 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552504 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552532 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552564 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552590 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552615 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552640 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552665 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552688 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552715 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552737 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552763 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552789 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552817 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552845 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552871 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552898 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552919 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552947 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552975 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553016 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553046 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553072 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553097 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553122 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553147 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553174 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553200 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553225 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551485 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" 
(OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.551947 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552362 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552347 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552697 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.552716 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553023 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553367 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553763 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.553853 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:17.053823182 +0000 UTC m=+23.139348406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578028 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578040 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554015 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554425 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554452 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554523 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554688 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554758 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554789 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.554988 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.555245 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578286 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.555292 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.555718 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.555771 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.556292 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579002 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.556334 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.556871 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.556930 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557338 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557356 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557529 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557700 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557942 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557965 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.557973 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.558362 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559191 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559338 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578357 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578400 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579366 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559433 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579405 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559706 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559729 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559844 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.560261 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.560328 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.560729 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.560935 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.559449 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561062 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561111 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561454 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561509 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579640 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.560476 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561778 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561791 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579679 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561977 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579709 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579744 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579766 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579791 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579577 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579855 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579888 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579909 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579930 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.579970 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580063 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580108 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580272 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.562080 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.562350 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.562694 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.562998 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.563307 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.563309 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561566 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.562732 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.563988 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.564311 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.564934 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.565019 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.565083 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.565098 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.565339 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.566014 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.577469 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578271 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578175 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578672 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.578770 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.553972 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580493 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561546 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.561989 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580008 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580002 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580184 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580208 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580293 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580584 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580719 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.581062 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.580302 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.581360 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.581377 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.581564 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.581293 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582004 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582021 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582071 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582077 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582101 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582070 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582134 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582158 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582179 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582199 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582251 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582284 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582309 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582338 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582380 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582406 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") 
pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582519 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582529 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582512 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582587 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582580 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582666 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582762 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582889 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582927 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582954 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.582976 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583058 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583088 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583147 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583208 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583149 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583251 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583297 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583320 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583345 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583365 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583425 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583555 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583719 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.610943 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613200 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613316 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613338 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613360 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613475 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613496 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613555 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613592 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613640 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613634 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613682 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.583572 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613713 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613731 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613740 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613779 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613853 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613891 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613910 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613924 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613952 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.613976 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614001 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614026 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614054 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614081 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614110 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614139 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614164 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614183 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614203 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614249 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614279 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614312 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614333 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614363 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614372 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614393 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614417 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614435 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614453 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614473 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614496 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614523 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614543 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.614885 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615130 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615169 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615458 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.616989 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617241 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617273 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617298 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617539 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617577 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617733 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617778 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618008 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618047 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618080 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618111 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618162 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618205 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618252 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618288 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618362 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618432 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618458 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618680 4852 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618909 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.619181 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623698 4852 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc 
kubenswrapper[4852]: I1210 11:52:16.623747 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623768 4852 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623786 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623807 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623825 4852 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624179 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624214 4852 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624251 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624266 4852 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624344 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624363 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624379 4852 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625151 4852 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc 
kubenswrapper[4852]: I1210 11:52:16.625174 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625194 4852 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625211 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625247 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625264 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625280 4852 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625297 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625314 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625329 4852 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625343 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625357 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625371 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625385 4852 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath 
\"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625401 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625414 4852 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.625428 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626279 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626321 4852 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626336 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626351 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626371 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626431 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626448 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626464 4852 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626477 4852 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626491 4852 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node 
\"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626507 4852 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626537 4852 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626552 4852 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626566 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626578 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626597 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626611 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626627 4852 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626641 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626656 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626672 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626689 4852 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626703 4852 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on 
node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626717 4852 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626733 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626751 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626765 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626779 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626793 4852 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626807 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626821 4852 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626835 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626849 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626863 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626877 4852 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626890 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on 
node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626902 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626917 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626933 4852 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626946 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626971 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626986 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627001 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627023 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627037 4852 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627052 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627066 4852 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627080 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627094 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node 
\"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627106 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627119 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627134 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627147 4852 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627160 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627174 4852 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627187 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627201 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627214 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627260 4852 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627276 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627300 4852 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627316 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" 
Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627331 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627342 4852 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627354 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627365 4852 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627431 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627443 4852 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627455 4852 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627467 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627478 4852 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627490 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627508 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627520 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627531 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 
11:52:16.627543 4852 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627956 4852 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627967 4852 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.627978 4852 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628012 4852 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628102 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628114 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628308 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628321 4852 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628333 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628344 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628357 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628374 4852 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 
11:52:16.628387 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628399 4852 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628412 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628423 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.628434 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615130 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615144 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615678 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615904 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.615961 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.616067 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617139 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617452 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617627 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617666 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617784 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.617967 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618202 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618552 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.618955 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.619121 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.619098 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.619569 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.619792 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.620098 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.620607 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.620709 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.621048 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.621129 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.621371 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.621749 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.622444 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623405 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623333 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.623892 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624783 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624905 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.624892 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626068 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.626352 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.630325 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.630358 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.630999 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631138 4852 scope.go:117] "RemoveContainer" containerID="49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631599 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631741 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631771 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631815 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631862 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.631893 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.632060 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.632283 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.632426 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.632497 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.632567 4852 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.632642 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:17.132606725 +0000 UTC m=+23.218132149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.632953 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.633127 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.633193 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:17.133157239 +0000 UTC m=+23.218682453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.633009 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.634180 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.634604 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.635327 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.636758 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.639738 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.650277 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.650260 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.650473 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.650581 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.650708 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.650803 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.651933 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.651958 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.652049 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:17.152020731 +0000 UTC m=+23.237545955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.652263 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.651090 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.652350 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.652373 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.652476 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:17.152449672 +0000 UTC m=+23.237975106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.651247 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.651589 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.651795 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.652623 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.653320 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.657309 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.663724 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.663844 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.665196 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.665555 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.665798 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.666269 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.666800 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.667311 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.667954 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.674155 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.693754 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.704894 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.726435 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.729842 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.729912 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730005 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730021 4852 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730033 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730046 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730058 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730070 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730081 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730092 4852 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730103 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730114 4852 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730128 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730120 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730191 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730141 4852 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730261 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730273 4852 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730284 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730296 4852 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730311 4852 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730323 4852 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730334 4852 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730345 4852 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730359 4852 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730371 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730382 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730393 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730409 4852 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730422 4852 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730436 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730449 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730464 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730480 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730495 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730508 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730521 4852 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730534 4852 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730546 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730559 4852 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730572 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730584 4852 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730597 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730611 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730623 4852 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730636 4852 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730647 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730659 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730671 4852 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730682 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730694 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730706 4852 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730718 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730730 4852 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730741 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730752 4852 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730763 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730775 4852 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730786 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730797 4852 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730852 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730864 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730877 4852 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730888 4852 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730902 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730913 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730924 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730936 4852 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730947 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.730961 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.731158 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.737201 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.741626 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.751580 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.752074 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:16 crc kubenswrapper[4852]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Dec 10 11:52:16 crc kubenswrapper[4852]: if [[ -f "/env/_master" ]]; then Dec 10 11:52:16 crc kubenswrapper[4852]: set -o allexport Dec 10 11:52:16 crc kubenswrapper[4852]: source "/env/_master" Dec 10 11:52:16 crc kubenswrapper[4852]: set +o allexport Dec 10 11:52:16 crc kubenswrapper[4852]: fi Dec 10 11:52:16 crc kubenswrapper[4852]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Dec 10 11:52:16 crc kubenswrapper[4852]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Dec 10 11:52:16 crc kubenswrapper[4852]: ho_enable="--enable-hybrid-overlay" Dec 10 11:52:16 crc kubenswrapper[4852]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Dec 10 11:52:16 crc kubenswrapper[4852]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Dec 10 11:52:16 crc kubenswrapper[4852]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Dec 10 11:52:16 crc kubenswrapper[4852]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Dec 10 11:52:16 crc kubenswrapper[4852]: --webhook-cert-dir="/etc/webhook-cert" \ Dec 10 11:52:16 crc kubenswrapper[4852]: --webhook-host=127.0.0.1 \ Dec 10 11:52:16 crc kubenswrapper[4852]: --webhook-port=9743 \ Dec 10 11:52:16 crc kubenswrapper[4852]: ${ho_enable} \ Dec 10 11:52:16 crc kubenswrapper[4852]: --enable-interconnect \ Dec 10 11:52:16 crc kubenswrapper[4852]: --disable-approver \ Dec 10 11:52:16 crc kubenswrapper[4852]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Dec 10 11:52:16 crc kubenswrapper[4852]: --wait-for-kubernetes-api=200s \ Dec 10 11:52:16 crc kubenswrapper[4852]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Dec 10 11:52:16 crc kubenswrapper[4852]: --loglevel="${LOGLEVEL}" Dec 10 11:52:16 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Dec 10 11:52:16 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.755833 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.756579 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:16 crc kubenswrapper[4852]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Dec 10 11:52:16 crc kubenswrapper[4852]: if [[ -f "/env/_master" ]]; then Dec 10 11:52:16 crc kubenswrapper[4852]: set -o allexport Dec 10 11:52:16 crc kubenswrapper[4852]: source "/env/_master" Dec 10 11:52:16 crc kubenswrapper[4852]: set +o allexport Dec 10 11:52:16 crc kubenswrapper[4852]: fi Dec 10 11:52:16 crc kubenswrapper[4852]: Dec 10 11:52:16 crc kubenswrapper[4852]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Dec 10 11:52:16 crc kubenswrapper[4852]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Dec 10 11:52:16 crc kubenswrapper[4852]: --disable-webhook \ Dec 10 11:52:16 crc kubenswrapper[4852]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Dec 10 11:52:16 crc kubenswrapper[4852]: --loglevel="${LOGLEVEL}" Dec 10 11:52:16 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:16 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.759000 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for 
\"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Dec 10 11:52:16 crc kubenswrapper[4852]: W1210 11:52:16.811202 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-94aca46a1df9921f3030b32f5f84df21559e30ed3514c5ce46d5396d568382c5 WatchSource:0}: Error finding container 94aca46a1df9921f3030b32f5f84df21559e30ed3514c5ce46d5396d568382c5: Status 404 returned error can't find the container with id 94aca46a1df9921f3030b32f5f84df21559e30ed3514c5ce46d5396d568382c5 Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.814164 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.815327 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.817514 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:16 crc kubenswrapper[4852]: container 
&Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Dec 10 11:52:16 crc kubenswrapper[4852]: set -o allexport Dec 10 11:52:16 crc kubenswrapper[4852]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 10 11:52:16 crc kubenswrapper[4852]: source /etc/kubernetes/apiserver-url.env Dec 10 11:52:16 crc kubenswrapper[4852]: else Dec 10 11:52:16 crc kubenswrapper[4852]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 10 11:52:16 crc kubenswrapper[4852]: exit 1 Dec 10 11:52:16 crc kubenswrapper[4852]: fi Dec 10 11:52:16 crc kubenswrapper[4852]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 10 11:52:16 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd890
9e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:16 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.820104 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.831652 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.831692 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" 
DevicePath \"\"" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.950903 4852 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.953653 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.953738 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.953753 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.953847 4852 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.967622 4852 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.967925 4852 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.969439 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.969477 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.969488 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.969511 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:16 crc kubenswrapper[4852]: I1210 11:52:16.969536 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:16Z","lastTransitionTime":"2025-12-10T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:16 crc kubenswrapper[4852]: E1210 11:52:16.993310 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ec81da2d-010c-440a-a2c9-f3547047ac06\\\",\\\"systemUUID\\\":\\\"8aceede4-7323-43e8-a979-088fd86df0ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.005721 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.005815 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.005835 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.005871 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.005887 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.021528 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.026361 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.026438 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.026452 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
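The loop above repeats because every node-status patch from the kubelet has to pass the admission webhook "node.network-node-identity.openshift.io", and nothing is listening on 127.0.0.1:9743 yet, so each attempt ends in "connection refused" and the kubelet schedules another try. A minimal sketch of the same reachability check the failing dial performs (Python, standard library only; the host and port are taken from the log lines above):

```python
# Probe the webhook endpoint the kubelet is trying to reach. A refused
# TCP connect here corresponds to the "dial tcp 127.0.0.1:9743:
# connect: connection refused" errors in the log.
import socket

def webhook_listening(host: str = "127.0.0.1", port: int = 9743,
                      timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except ConnectionRefusedError:
        return False  # the exact failure mode logged above
    except OSError:
        return False  # unreachable, timed out, etc.

if __name__ == "__main__":
    print("webhook endpoint up:", webhook_listening())
```

Once the network-node-identity pod brings its webhook server up (its ContainerStarted event appears further down), a probe like this would flip to True and the patch attempts would stop failing at the dial step.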
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.026516 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.026531 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.038096 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.043996 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.044312 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.044425 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.044533 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.044665 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.056120 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.060835 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.060874 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.060885 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.060905 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.060917 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.072989 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.073107 4852 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.074531 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.074549 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
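The "exceeds retry count" entry above closes one full retry cycle: the kubelet attempts the status patch a fixed number of times per sync before giving up (the nodeStatusUpdateRetry constant, 5 in current upstream kubelet sources). An illustrative Python model of that loop; patch_status is a hypothetical stand-in for the real PATCH request, not kubelet code:

```python
# Illustrative model of the retry loop behind "Error updating node
# status, will retry" followed by "update node status exceeds retry
# count". The retry budget matches the upstream kubelet default.
NODE_STATUS_UPDATE_RETRY = 5

class WebhookDown(Exception):
    pass

def patch_status() -> None:
    # Stand-in: every attempt in this log fails the same way.
    raise WebhookDown("dial tcp 127.0.0.1:9743: connect: connection refused")

def update_node_status() -> None:
    for _ in range(NODE_STATUS_UPDATE_RETRY):
        try:
            patch_status()
            return
        except WebhookDown as err:
            print(f'"Error updating node status, will retry" err="{err}"')
    print('"Unable to update node status" err="update node status exceeds retry count"')

if __name__ == "__main__":
    update_node_status()
```

The kubelet then waits for its next node-status sync period and starts a fresh cycle, which is why the same five-failure pattern recurs throughout this boot.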
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.074558 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.074571 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.074580 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.134120 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.134224 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.134276 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.134406 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.134466 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:18.134448265 +0000 UTC m=+24.219973489 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.134847 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:18.134836605 +0000 UTC m=+24.220361829 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.134897 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.134950 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:18.134920957 +0000 UTC m=+24.220446181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.176623 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.176658 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.176669 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.176686 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.176697 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
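Every volume failure in this stretch has the same shape: the object the mount needs (a Secret, a ConfigMap, or a registered CSI driver) is not yet known to the kubelet, so nestedpendingoperations.go records the failure and blocks retries until a deadline that backs off on each attempt, which is where the "No retries permitted until ... (durationBeforeRetry 1s)" wording comes from. A small sketch of that bookkeeping (Python; only the 1s delay appears in the log, and the growth factor and cap here are assumptions for illustration, not kubelet constants):

```python
# Sketch of the "No retries permitted until ..." bookkeeping: each
# failed volume operation records a time before which it may not be
# retried, and the delay grows on every failure. Factor and cap are
# illustrative assumptions.
from datetime import datetime, timedelta

class ExponentialBackoff:
    def __init__(self, initial=1.0, factor=2.0, cap=120.0):
        self.delay = initial      # seconds until the next permitted retry
        self.factor = factor
        self.cap = cap
        self.not_before = None    # datetime before which retries are blocked

    def record_failure(self, now):
        """Advance the retry deadline and grow the delay up to the cap."""
        self.not_before = now + timedelta(seconds=self.delay)
        self.delay = min(self.delay * self.factor, self.cap)

    def may_retry(self, now):
        return self.not_before is None or now >= self.not_before

if __name__ == "__main__":
    backoff = ExponentialBackoff(initial=1.0)  # matches durationBeforeRetry 1s
    now = datetime(2025, 12, 10, 11, 52, 17)
    for _ in range(3):
        backoff.record_failure(now)
        print(f"No retries permitted until {backoff.not_before}")
        now = backoff.not_before
```

The retries themselves are cheap; the operations keep failing only because the kubelet's object caches and CSI driver registry have not been populated yet this early in the boot.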
Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.235676 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.235734 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.235882 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.235902 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.235915 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.235970 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:18.235953138 +0000 UTC m=+24.321478362 (durationBeforeRetry 1s). 
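
The kube-api-access-* volumes named here are projected volumes that combine a service-account token with the kube-root-ca.crt and openshift-service-ca.crt configmaps, which is why one unsynced namespace yields paired "not registered" errors for a single mount. Assuming API access, the backing objects can be checked directly (names and namespace exactly as they appear in the log):

    oc get configmap kube-root-ca.crt openshift-service-ca.crt \
        -n openshift-network-diagnostics
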
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.236393 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.236413 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.236423 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.236453 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:18.23644409 +0000 UTC m=+24.321969314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.241960 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"94aca46a1df9921f3030b32f5f84df21559e30ed3514c5ce46d5396d568382c5"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.243256 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"93cd429b3ac682f91f5132eb472397856a580c192bb8e998bff60a5ccbe7688d"} Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.243498 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 
65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.244567 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:17 crc kubenswrapper[4852]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Dec 10 11:52:17 crc kubenswrapper[4852]: if [[ -f "/env/_master" ]]; then Dec 10 11:52:17 crc kubenswrapper[4852]: set -o allexport Dec 10 11:52:17 crc kubenswrapper[4852]: source "/env/_master" Dec 10 11:52:17 crc kubenswrapper[4852]: set +o allexport Dec 10 11:52:17 crc kubenswrapper[4852]: fi Dec 10 11:52:17 crc kubenswrapper[4852]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Dec 10 11:52:17 crc kubenswrapper[4852]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Dec 10 11:52:17 crc kubenswrapper[4852]: ho_enable="--enable-hybrid-overlay" Dec 10 11:52:17 crc kubenswrapper[4852]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Dec 10 11:52:17 crc kubenswrapper[4852]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Dec 10 11:52:17 crc kubenswrapper[4852]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Dec 10 11:52:17 crc kubenswrapper[4852]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Dec 10 11:52:17 crc kubenswrapper[4852]: --webhook-cert-dir="/etc/webhook-cert" \ Dec 10 11:52:17 crc kubenswrapper[4852]: --webhook-host=127.0.0.1 \ Dec 10 11:52:17 crc kubenswrapper[4852]: --webhook-port=9743 \ Dec 10 11:52:17 crc kubenswrapper[4852]: ${ho_enable} \ Dec 10 11:52:17 crc kubenswrapper[4852]: --enable-interconnect \ Dec 10 11:52:17 crc kubenswrapper[4852]: --disable-approver \ Dec 10 11:52:17 crc kubenswrapper[4852]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Dec 10 11:52:17 crc kubenswrapper[4852]: --wait-for-kubernetes-api=200s \ Dec 10 11:52:17 crc kubenswrapper[4852]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Dec 10 11:52:17 crc kubenswrapper[4852]: --loglevel="${LOGLEVEL}" Dec 10 11:52:17 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Dec 10 11:52:17 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.244736 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.245207 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.246574 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:17 crc kubenswrapper[4852]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Dec 10 11:52:17 crc kubenswrapper[4852]: if [[ -f "/env/_master" ]]; then Dec 10 11:52:17 crc kubenswrapper[4852]: set -o allexport Dec 10 11:52:17 crc kubenswrapper[4852]: source "/env/_master" Dec 10 11:52:17 crc kubenswrapper[4852]: set +o allexport Dec 10 11:52:17 crc kubenswrapper[4852]: fi Dec 10 11:52:17 crc kubenswrapper[4852]: Dec 10 11:52:17 crc kubenswrapper[4852]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Dec 10 11:52:17 crc kubenswrapper[4852]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Dec 10 11:52:17 crc kubenswrapper[4852]: --disable-webhook \ Dec 10 11:52:17 crc kubenswrapper[4852]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Dec 10 11:52:17 crc kubenswrapper[4852]: --loglevel="${LOGLEVEL}" Dec 10 11:52:17 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at 
least once, cannot construct envvars Dec 10 11:52:17 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.247779 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c"} Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.247981 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.248039 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.248930 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3d623a82ac8f3ff2a0f4ef971a86ac862f424516c2151faa6012b3a53c122af9"} Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.250606 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:17 crc kubenswrapper[4852]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Dec 10 11:52:17 crc kubenswrapper[4852]: set -o allexport Dec 10 11:52:17 crc kubenswrapper[4852]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 10 11:52:17 crc kubenswrapper[4852]: source /etc/kubernetes/apiserver-url.env Dec 10 11:52:17 crc kubenswrapper[4852]: else Dec 10 11:52:17 crc kubenswrapper[4852]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 10 11:52:17 crc kubenswrapper[4852]: exit 1 Dec 10 11:52:17 crc kubenswrapper[4852]: fi Dec 10 11:52:17 crc kubenswrapper[4852]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 10 11:52:17 crc kubenswrapper[4852]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:17 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:17 crc kubenswrapper[4852]: E1210 11:52:17.251811 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.279016 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.279096 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.279110 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.279184 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.279202 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
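
Every CreateContainerConfigError in this window carries the same text: "services have not yet been read at least once, cannot construct envvars". Kubelet injects service discovery environment variables (KUBERNETES_SERVICE_HOST and friends) from its service informer cache and refuses to start any container before that cache has completed one full sync, so the failures for iptables-alerter, webhook, approver, and network-operator are a startup-ordering artifact rather than four separate misconfigurations. One way to confirm the condition has cleared for an affected pod, using a pod name taken from the log:

    oc describe pod iptables-alerter-4ln5h -n openshift-network-operator
    # Or just its events, which repeat the CreateContainerConfigError while it persists
    oc get events -n openshift-network-operator \
        --field-selector involvedObject.name=iptables-alerter-4ln5h
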
Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.290695 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.323349 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.344083 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
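
The "Failed to update status for pod" entries are a secondary effect rather than an independent fault: status patches are admitted through the pod.network-node-identity.openshift.io webhook, and its backend is the same webhook container that just failed with CreateContainerConfigError, hence "connection refused" on 127.0.0.1:9743. The loop unwinds on its own once that container starts. A sketch for inspecting both ends (the webhook configuration name pattern is an assumption based on the webhook name in the log; the port is from the log):

    # Which webhook configuration points at this backend
    oc get validatingwebhookconfigurations -o name | grep network-node-identity
    # On the node: is anything listening on the webhook port yet?
    ss -tlnp | grep 9743
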
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.365717 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
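
These status patches embed the terminated container's own log tail in the status message, which is how the kube-apiserver-check-endpoints failure quoted earlier surfaces here: exit code 255 and a fatal "pods \"kube-apiserver-crc\" not found" after a TLS handshake timeout, i.e. the checker raced its own API server's startup. The same text can be pulled directly, using the pod and container names from the log:

    oc logs kube-apiserver-crc -n openshift-kube-apiserver \
        -c kube-apiserver-check-endpoints --previous
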
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.382125 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.382179 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.382193 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.382212 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.382264 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.387107 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.401694 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.421268 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.433086 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
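
The recurring lastState blocks with exitCode 137, reason ContainerStatusUnknown, and "The container could not be located when the pod was deleted" are synthesized rather than observed: 137 is the conventional 128+SIGKILL value that kubelet fills in when a container that used to be Running can no longer be found in the runtime after the restart. That can be cross-checked against the runtime's own records on the node (crictl is present on OpenShift nodes; the grep patterns are illustrative):

    # Containers the runtime actually knows about, including exited ones
    crictl ps -a | grep -e iptables-alerter -e network-operator
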
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.451270 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.467627 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.479092 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.484343 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.484385 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.484394 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.484410 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.484419 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.494678 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.510901 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.544985 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.587007 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.587053 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.587065 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.587082 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.587094 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.689355 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.689392 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.689401 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.689416 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.689425 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.706529 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qbbd2"] Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.706915 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.707142 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mzcx9"] Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.707596 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.708925 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-thqgh"] Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.709348 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.710200 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.710301 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.710511 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.711365 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.711652 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.711815 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.711996 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.712058 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.712340 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.712392 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.712453 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.712783 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.712794 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.722774 4852 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.735133 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.751390 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.764824 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.775290 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.788064 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.791774 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.791817 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.791831 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.791860 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.791883 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.801069 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.812532 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.821180 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.837068 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841536 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-netns\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841586 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-hostroot\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841616 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cscwq\" 
(UniqueName: \"kubernetes.io/projected/06184023-d738-4d23-ae7e-bc0dde135fa2-kube-api-access-cscwq\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841640 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d01ef2d-58af-42c3-b716-9020614e2a09-cni-binary-copy\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841795 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-conf-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841877 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-etc-kubernetes\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841911 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56z4\" (UniqueName: \"kubernetes.io/projected/2fa2206a-32c7-4bcc-8899-9bb9742ba9fd-kube-api-access-d56z4\") pod \"node-resolver-qbbd2\" (UID: \"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\") " pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.841981 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06184023-d738-4d23-ae7e-bc0dde135fa2-proxy-tls\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842027 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-cnibin\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842056 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-k8s-cni-cncf-io\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842083 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-cni-bin\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842107 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2fa2206a-32c7-4bcc-8899-9bb9742ba9fd-hosts-file\") pod \"node-resolver-qbbd2\" (UID: \"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\") " pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842130 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-cni-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842155 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-os-release\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842176 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-socket-dir-parent\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842207 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-cni-multus\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842289 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/06184023-d738-4d23-ae7e-bc0dde135fa2-rootfs\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842323 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-system-cni-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842351 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-kubelet\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842388 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-daemon-config\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842419 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qrh7b\" (UniqueName: \"kubernetes.io/projected/7d01ef2d-58af-42c3-b716-9020614e2a09-kube-api-access-qrh7b\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842448 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-multus-certs\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.842489 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06184023-d738-4d23-ae7e-bc0dde135fa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.852737 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.871827 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.882997 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.893087 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.894106 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.894175 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.894190 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.894215 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.894246 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.917051 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.936364 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943618 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-hostroot\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943672 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-netns\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943699 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cscwq\" (UniqueName: \"kubernetes.io/projected/06184023-d738-4d23-ae7e-bc0dde135fa2-kube-api-access-cscwq\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943725 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d01ef2d-58af-42c3-b716-9020614e2a09-cni-binary-copy\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943751 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-conf-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 
11:52:17.943804 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-netns\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943851 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-conf-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943874 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-etc-kubernetes\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943776 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-etc-kubernetes\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943920 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d56z4\" (UniqueName: \"kubernetes.io/projected/2fa2206a-32c7-4bcc-8899-9bb9742ba9fd-kube-api-access-d56z4\") pod \"node-resolver-qbbd2\" (UID: \"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\") " pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943955 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06184023-d738-4d23-ae7e-bc0dde135fa2-proxy-tls\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943979 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-cnibin\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944001 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-k8s-cni-cncf-io\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944027 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-cni-bin\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944047 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-cni-dir\") pod 
\"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944068 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-os-release\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944093 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-socket-dir-parent\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944118 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-cni-multus\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944122 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-k8s-cni-cncf-io\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944139 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2fa2206a-32c7-4bcc-8899-9bb9742ba9fd-hosts-file\") pod \"node-resolver-qbbd2\" (UID: \"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\") " pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944172 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/06184023-d738-4d23-ae7e-bc0dde135fa2-rootfs\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944195 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-system-cni-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944219 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-kubelet\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944281 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-daemon-config\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944308 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrh7b\" (UniqueName: \"kubernetes.io/projected/7d01ef2d-58af-42c3-b716-9020614e2a09-kube-api-access-qrh7b\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944330 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-cni-multus\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944343 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06184023-d738-4d23-ae7e-bc0dde135fa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944369 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-multus-certs\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944377 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-cni-bin\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944423 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-run-multus-certs\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944493 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2fa2206a-32c7-4bcc-8899-9bb9742ba9fd-hosts-file\") pod \"node-resolver-qbbd2\" (UID: \"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\") " pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944529 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/06184023-d738-4d23-ae7e-bc0dde135fa2-rootfs\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944549 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-cnibin\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944563 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-cni-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944570 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-system-cni-dir\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944653 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-os-release\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944706 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-socket-dir-parent\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944749 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-host-var-lib-kubelet\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.943933 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d01ef2d-58af-42c3-b716-9020614e2a09-hostroot\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.944853 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d01ef2d-58af-42c3-b716-9020614e2a09-cni-binary-copy\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.945143 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d01ef2d-58af-42c3-b716-9020614e2a09-multus-daemon-config\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.945822 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06184023-d738-4d23-ae7e-bc0dde135fa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.948424 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06184023-d738-4d23-ae7e-bc0dde135fa2-proxy-tls\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: 
I1210 11:52:17.961981 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.975316 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56z4\" (UniqueName: \"kubernetes.io/projected/2fa2206a-32c7-4bcc-8899-9bb9742ba9fd-kube-api-access-d56z4\") pod \"node-resolver-qbbd2\" (UID: \"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\") " pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.980993 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrh7b\" (UniqueName: \"kubernetes.io/projected/7d01ef2d-58af-42c3-b716-9020614e2a09-kube-api-access-qrh7b\") pod \"multus-mzcx9\" (UID: \"7d01ef2d-58af-42c3-b716-9020614e2a09\") " pod="openshift-multus/multus-mzcx9" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.985026 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cscwq\" (UniqueName: \"kubernetes.io/projected/06184023-d738-4d23-ae7e-bc0dde135fa2-kube-api-access-cscwq\") pod \"machine-config-daemon-thqgh\" (UID: \"06184023-d738-4d23-ae7e-bc0dde135fa2\") " 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.988408 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.16
8.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.997190 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.997253 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.997267 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.997287 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:17 crc kubenswrapper[4852]: I1210 11:52:17.997302 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:17Z","lastTransitionTime":"2025-12-10T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.021494 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qbbd2" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.031438 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mzcx9" Dec 10 11:52:18 crc kubenswrapper[4852]: W1210 11:52:18.031949 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa2206a_32c7_4bcc_8899_9bb9742ba9fd.slice/crio-72b36f6a15e34c5fe83872e71daf4fcadf752a0a0cfe00edbe5b0bdeb5efe9e0 WatchSource:0}: Error finding container 72b36f6a15e34c5fe83872e71daf4fcadf752a0a0cfe00edbe5b0bdeb5efe9e0: Status 404 returned error can't find the container with id 72b36f6a15e34c5fe83872e71daf4fcadf752a0a0cfe00edbe5b0bdeb5efe9e0 Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.034151 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:18 crc kubenswrapper[4852]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Dec 10 11:52:18 crc kubenswrapper[4852]: set -uo pipefail Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Dec 10 11:52:18 crc kubenswrapper[4852]: HOSTS_FILE="/etc/hosts" Dec 10 11:52:18 crc kubenswrapper[4852]: TEMP_FILE="/etc/hosts.tmp" Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: IFS=', ' read -r -a services <<< "${SERVICES}" Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: # Make a temporary file with the old hosts file's attributes. Dec 10 11:52:18 crc kubenswrapper[4852]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Dec 10 11:52:18 crc kubenswrapper[4852]: echo "Failed to preserve hosts file. Exiting." Dec 10 11:52:18 crc kubenswrapper[4852]: exit 1 Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: while true; do Dec 10 11:52:18 crc kubenswrapper[4852]: declare -A svc_ips Dec 10 11:52:18 crc kubenswrapper[4852]: for svc in "${services[@]}"; do Dec 10 11:52:18 crc kubenswrapper[4852]: # Fetch service IP from cluster dns if present. We make several tries Dec 10 11:52:18 crc kubenswrapper[4852]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Dec 10 11:52:18 crc kubenswrapper[4852]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Dec 10 11:52:18 crc kubenswrapper[4852]: # support UDP loadbalancers and require reaching DNS through TCP. Dec 10 11:52:18 crc kubenswrapper[4852]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Dec 10 11:52:18 crc kubenswrapper[4852]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Dec 10 11:52:18 crc kubenswrapper[4852]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Dec 10 11:52:18 crc kubenswrapper[4852]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Dec 10 11:52:18 crc kubenswrapper[4852]: for i in ${!cmds[*]} Dec 10 11:52:18 crc kubenswrapper[4852]: do Dec 10 11:52:18 crc kubenswrapper[4852]: ips=($(eval "${cmds[i]}")) Dec 10 11:52:18 crc kubenswrapper[4852]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Dec 10 11:52:18 crc kubenswrapper[4852]: svc_ips["${svc}"]="${ips[@]}" Dec 10 11:52:18 crc kubenswrapper[4852]: break Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: # Update /etc/hosts only if we get valid service IPs Dec 10 11:52:18 crc kubenswrapper[4852]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Dec 10 11:52:18 crc kubenswrapper[4852]: # Stale entries could exist in /etc/hosts if the service is deleted Dec 10 11:52:18 crc kubenswrapper[4852]: if [[ -n "${svc_ips[*]-}" ]]; then Dec 10 11:52:18 crc kubenswrapper[4852]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Dec 10 11:52:18 crc kubenswrapper[4852]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Dec 10 11:52:18 crc kubenswrapper[4852]: # Only continue rebuilding the hosts entries if its original content is preserved Dec 10 11:52:18 crc kubenswrapper[4852]: sleep 60 & wait Dec 10 11:52:18 crc kubenswrapper[4852]: continue Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: # Append resolver entries for services Dec 10 11:52:18 crc kubenswrapper[4852]: rc=0 Dec 10 11:52:18 crc kubenswrapper[4852]: for svc in "${!svc_ips[@]}"; do Dec 10 11:52:18 crc kubenswrapper[4852]: for ip in ${svc_ips[${svc}]}; do Dec 10 11:52:18 crc kubenswrapper[4852]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: if [[ $rc -ne 0 ]]; then Dec 10 11:52:18 crc kubenswrapper[4852]: sleep 60 & wait Dec 10 11:52:18 crc kubenswrapper[4852]: continue Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Dec 10 11:52:18 crc kubenswrapper[4852]: # Replace /etc/hosts with our modified version if needed Dec 10 11:52:18 crc kubenswrapper[4852]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Dec 10 11:52:18 crc kubenswrapper[4852]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: sleep 60 & wait Dec 10 11:52:18 crc kubenswrapper[4852]: unset svc_ips Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d56z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qbbd2_openshift-dns(2fa2206a-32c7-4bcc-8899-9bb9742ba9fd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:18 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.035277 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qbbd2" podUID="2fa2206a-32c7-4bcc-8899-9bb9742ba9fd" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.038181 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:52:18 crc kubenswrapper[4852]: W1210 11:52:18.045347 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d01ef2d_58af_42c3_b716_9020614e2a09.slice/crio-e588e56cd887e2b503d07843a75c249a7a72f8739a35f84ea352f9eca0a2935f WatchSource:0}: Error finding container e588e56cd887e2b503d07843a75c249a7a72f8739a35f84ea352f9eca0a2935f: Status 404 returned error can't find the container with id e588e56cd887e2b503d07843a75c249a7a72f8739a35f84ea352f9eca0a2935f Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.047994 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:18 crc kubenswrapper[4852]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Dec 10 11:52:18 crc kubenswrapper[4852]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Dec 10 11:52:18 crc kubenswrapper[4852]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrh7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,Re
cursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-mzcx9_openshift-multus(7d01ef2d-58af-42c3-b716-9020614e2a09): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:18 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.049206 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-mzcx9" podUID="7d01ef2d-58af-42c3-b716-9020614e2a09" Dec 10 11:52:18 crc kubenswrapper[4852]: W1210 11:52:18.055871 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06184023_d738_4d23_ae7e_bc0dde135fa2.slice/crio-46ed71118a26fbd64943ad38ed12d0d648c492dfbc2825a19489d189687e0de7 WatchSource:0}: Error finding container 46ed71118a26fbd64943ad38ed12d0d648c492dfbc2825a19489d189687e0de7: Status 404 returned error can't find the container with id 46ed71118a26fbd64943ad38ed12d0d648c492dfbc2825a19489d189687e0de7 Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.059313 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cscwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.061585 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cscwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.062679 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" 
with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.091753 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dfbl6"] Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.092557 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.095827 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.096022 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.099668 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.099696 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.099706 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.099722 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.099734 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.119950 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.132294 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.145836 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.145976 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.146003 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.146350 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.146407 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.146849 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:20.146188841 +0000 UTC m=+26.231714145 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.146893 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:20.146881638 +0000 UTC m=+26.232407062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.146910 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:20.146901469 +0000 UTC m=+26.232426913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.156114 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.169272 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.169373 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.169409 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.169364 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f671
3d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.169371 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.169515 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.169652 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.173240 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.173772 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.174981 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.175591 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.176541 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.177063 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.177643 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.178725 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.179427 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.180482 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.181071 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.182396 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.182987 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.183676 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.183726 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.185087 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.185730 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.186900 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.187563 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.188223 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.189458 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.189925 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.191090 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.191721 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.192949 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.193445 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.194185 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.195052 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.195632 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.196191 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.197478 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.198044 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.199057 4852 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.199179 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.201131 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.201755 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.201879 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.201972 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
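
Two distinct things are interleaved in the records above. First, kubelet is garbage-collecting volume directories under /var/lib/kubelet/pods/<uid>/volumes that were orphaned by pods removed while it was down (kubelet_volumes.go:163, plus :152 for a leftover volume subpath). Second, every pod status patch is rejected with the same error: the API server must consult the admission webhook "pod.network-node-identity.openshift.io" at https://127.0.0.1:9743/pod, but nothing is listening on that port yet, so each PATCH fails with "connection refused". The pod serving that webhook is itself stuck in CreateContainerConfigError (see the network-node-identity-vrzqb status further down), so these failures repeat until the network stack comes up. A minimal sketch for checking both from a shell on the node; the commands are standard tools, and only the port and path are taken from this log:

    # Is anything serving the node-identity webhook yet?
    ss -tlnp | grep 9743 || echo "webhook endpoint 127.0.0.1:9743 not listening"
    # Orphaned pod UID directories shrink as the cleanup above proceeds.
    ls /var/lib/kubelet/pods/
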
"Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.202143 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.202106 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.203003 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.205174 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.205985 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.207094 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.207892 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.209076 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.209652 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.210398 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.210754 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.211433 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.212651 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.213219 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.214403 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.214993 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.216354 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.216928 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.218200 4852 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.218668 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.219579 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.220128 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.220681 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.223852 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.236425 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247038 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-system-cni-dir\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247098 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247136 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-cnibin\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " 
pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247184 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-os-release\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247208 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4wn\" (UniqueName: \"kubernetes.io/projected/94b935ad-e468-4e03-9bfa-855973944f74-kube-api-access-cm4wn\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247246 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94b935ad-e468-4e03-9bfa-855973944f74-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247271 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94b935ad-e468-4e03-9bfa-855973944f74-cni-binary-copy\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247301 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.247330 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247485 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247516 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247529 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:18 crc 
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247573 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:20.24756008 +0000 UTC m=+26.333085304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247770 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247851 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.247909 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.248065 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:20.248044433 +0000 UTC m=+26.333569707 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.250535 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.259905 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qbbd2" event={"ID":"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd","Type":"ContainerStarted","Data":"72b36f6a15e34c5fe83872e71daf4fcadf752a0a0cfe00edbe5b0bdeb5efe9e0"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.262113 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzcx9" event={"ID":"7d01ef2d-58af-42c3-b716-9020614e2a09","Type":"ContainerStarted","Data":"e588e56cd887e2b503d07843a75c249a7a72f8739a35f84ea352f9eca0a2935f"} Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.262976 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:18 crc kubenswrapper[4852]: container 
&Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Dec 10 11:52:18 crc kubenswrapper[4852]: set -uo pipefail Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Dec 10 11:52:18 crc kubenswrapper[4852]: HOSTS_FILE="/etc/hosts" Dec 10 11:52:18 crc kubenswrapper[4852]: TEMP_FILE="/etc/hosts.tmp" Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: IFS=', ' read -r -a services <<< "${SERVICES}" Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: # Make a temporary file with the old hosts file's attributes. Dec 10 11:52:18 crc kubenswrapper[4852]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Dec 10 11:52:18 crc kubenswrapper[4852]: echo "Failed to preserve hosts file. Exiting." Dec 10 11:52:18 crc kubenswrapper[4852]: exit 1 Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: while true; do Dec 10 11:52:18 crc kubenswrapper[4852]: declare -A svc_ips Dec 10 11:52:18 crc kubenswrapper[4852]: for svc in "${services[@]}"; do Dec 10 11:52:18 crc kubenswrapper[4852]: # Fetch service IP from cluster dns if present. We make several tries Dec 10 11:52:18 crc kubenswrapper[4852]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Dec 10 11:52:18 crc kubenswrapper[4852]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Dec 10 11:52:18 crc kubenswrapper[4852]: # support UDP loadbalancers and require reaching DNS through TCP. Dec 10 11:52:18 crc kubenswrapper[4852]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Dec 10 11:52:18 crc kubenswrapper[4852]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Dec 10 11:52:18 crc kubenswrapper[4852]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Dec 10 11:52:18 crc kubenswrapper[4852]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Dec 10 11:52:18 crc kubenswrapper[4852]: for i in ${!cmds[*]} Dec 10 11:52:18 crc kubenswrapper[4852]: do Dec 10 11:52:18 crc kubenswrapper[4852]: ips=($(eval "${cmds[i]}")) Dec 10 11:52:18 crc kubenswrapper[4852]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Dec 10 11:52:18 crc kubenswrapper[4852]: svc_ips["${svc}"]="${ips[@]}" Dec 10 11:52:18 crc kubenswrapper[4852]: break Dec 10 11:52:18 crc kubenswrapper[4852]: fi Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: done Dec 10 11:52:18 crc kubenswrapper[4852]: Dec 10 11:52:18 crc kubenswrapper[4852]: # Update /etc/hosts only if we get valid service IPs Dec 10 11:52:18 crc kubenswrapper[4852]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Dec 10 11:52:18 crc kubenswrapper[4852]: # Stale entries could exist in /etc/hosts if the service is deleted Dec 10 11:52:18 crc kubenswrapper[4852]: if [[ -n "${svc_ips[*]-}" ]]; then Dec 10 11:52:18 crc kubenswrapper[4852]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Dec 10 11:52:18 crc kubenswrapper[4852]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Dec 10 11:52:18 crc kubenswrapper[4852]: # Only continue rebuilding the hosts entries if its original content is preserved
Dec 10 11:52:18 crc kubenswrapper[4852]: sleep 60 & wait
Dec 10 11:52:18 crc kubenswrapper[4852]: continue
Dec 10 11:52:18 crc kubenswrapper[4852]: fi
Dec 10 11:52:18 crc kubenswrapper[4852]: 
Dec 10 11:52:18 crc kubenswrapper[4852]: # Append resolver entries for services
Dec 10 11:52:18 crc kubenswrapper[4852]: rc=0
Dec 10 11:52:18 crc kubenswrapper[4852]: for svc in "${!svc_ips[@]}"; do
Dec 10 11:52:18 crc kubenswrapper[4852]: for ip in ${svc_ips[${svc}]}; do
Dec 10 11:52:18 crc kubenswrapper[4852]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Dec 10 11:52:18 crc kubenswrapper[4852]: done
Dec 10 11:52:18 crc kubenswrapper[4852]: done
Dec 10 11:52:18 crc kubenswrapper[4852]: if [[ $rc -ne 0 ]]; then
Dec 10 11:52:18 crc kubenswrapper[4852]: sleep 60 & wait
Dec 10 11:52:18 crc kubenswrapper[4852]: continue
Dec 10 11:52:18 crc kubenswrapper[4852]: fi
Dec 10 11:52:18 crc kubenswrapper[4852]: 
Dec 10 11:52:18 crc kubenswrapper[4852]: 
Dec 10 11:52:18 crc kubenswrapper[4852]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Dec 10 11:52:18 crc kubenswrapper[4852]: # Replace /etc/hosts with our modified version if needed
Dec 10 11:52:18 crc kubenswrapper[4852]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Dec 10 11:52:18 crc kubenswrapper[4852]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Dec 10 11:52:18 crc kubenswrapper[4852]: fi
Dec 10 11:52:18 crc kubenswrapper[4852]: sleep 60 & wait
Dec 10 11:52:18 crc kubenswrapper[4852]: unset svc_ips
Dec 10 11:52:18 crc kubenswrapper[4852]: done
Dec 10 11:52:18 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d56z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qbbd2_openshift-dns(2fa2206a-32c7-4bcc-8899-9bb9742ba9fd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Dec 10 11:52:18 crc kubenswrapper[4852]: > logger="UnhandledError"
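
The err=< block above is kubelet dumping the failed container's full Go spec (&Container{...}) after StartContainer was aborted with CreateContainerConfigError: building a container's environment requires kubelet's service informer to have synced at least once (for the legacy per-service env vars), and it has not yet. The embedded Args script is the dns-node-resolver loop, which keeps /etc/hosts entries for the services named in SERVICES by resolving them against the cluster DNS and rewriting the file through a temp copy. A minimal sketch of the single lookup the script loops over, with SERVICES, NAMESERVER and CLUSTER_DOMAIN filled in from the EnvVar values in this spec:

    # One iteration of the script's IPv4 lookup (the first of its four dig fallbacks).
    dig -t A @10.217.4.10 +short "image-registry.openshift-image-registry.svc.cluster.local" | grep -v '^;'
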
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.264050 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"46ed71118a26fbd64943ad38ed12d0d648c492dfbc2825a19489d189687e0de7"}
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.264160 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qbbd2" podUID="2fa2206a-32c7-4bcc-8899-9bb9742ba9fd"
Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.265214 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Dec 10 11:52:18 crc kubenswrapper[4852]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT=""
Dec 10 11:52:18 crc kubenswrapper[4852]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT
Dec 10 11:52:18 crc kubenswrapper[4852]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrh7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-mzcx9_openshift-multus(7d01ef2d-58af-42c3-b716-9020614e2a09): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:18 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.265324 4852 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cscwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.270326 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-mzcx9" podUID="7d01ef2d-58af-42c3-b716-9020614e2a09" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.271446 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cscwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.272605 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.274083 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.293438 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.304103 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.304732 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.304877 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.304948 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.305018 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.305083 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.314920 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348181 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-system-cni-dir\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348299 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-cnibin\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348350 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-os-release\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348392 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4wn\" (UniqueName: \"kubernetes.io/projected/94b935ad-e468-4e03-9bfa-855973944f74-kube-api-access-cm4wn\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348414 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94b935ad-e468-4e03-9bfa-855973944f74-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348434 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94b935ad-e468-4e03-9bfa-855973944f74-cni-binary-copy\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348454 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348739 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-system-cni-dir\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.348959 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.349051 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-os-release\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.349850 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94b935ad-e468-4e03-9bfa-855973944f74-cnibin\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.350534 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94b935ad-e468-4e03-9bfa-855973944f74-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.350649 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94b935ad-e468-4e03-9bfa-855973944f74-cni-binary-copy\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.357791 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.380167 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.385878 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4wn\" (UniqueName: \"kubernetes.io/projected/94b935ad-e468-4e03-9bfa-855973944f74-kube-api-access-cm4wn\") pod \"multus-additional-cni-plugins-dfbl6\" (UID: \"94b935ad-e468-4e03-9bfa-855973944f74\") " pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.403585 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.405294 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.407076 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.407105 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.407116 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.407133 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.407145 4852 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: W1210 11:52:18.414562 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b935ad_e468_4e03_9bfa_855973944f74.slice/crio-e9f35ed333132049c4fcb0fad611d6ac3c84145a1266ae1c4a3ff2b9f2e5ab6b WatchSource:0}: Error finding container e9f35ed333132049c4fcb0fad611d6ac3c84145a1266ae1c4a3ff2b9f2e5ab6b: Status 404 returned error can't find the container with id e9f35ed333132049c4fcb0fad611d6ac3c84145a1266ae1c4a3ff2b9f2e5ab6b Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.417585 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm4wn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dfbl6_openshift-multus(94b935ad-e468-4e03-9bfa-855973944f74): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.418792 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" podUID="94b935ad-e468-4e03-9bfa-855973944f74" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.423846 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.435896 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.444902 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.454671 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.465417 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-89m87"] Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.465562 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.466247 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.468649 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.468997 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.469170 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.469723 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.469752 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.470119 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.470172 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.479401 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.487826 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.500933 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 
10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.508770 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.508813 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.508824 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.508851 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.508863 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.512808 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.522452 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.532981 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.544555 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550477 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-slash\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550509 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550531 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-bin\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550547 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-systemd-units\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550562 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-ovn\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550632 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-config\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550755 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-systemd\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550791 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-kubelet\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550807 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-node-log\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550821 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-script-lib\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550894 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-var-lib-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550910 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550964 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-netns\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550979 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-log-socket\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.550991 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-ovn-kubernetes\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.551004 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-etc-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.551027 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-env-overrides\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.551041 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovn-node-metrics-cert\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.551087 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-netd\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.551150 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgtq\" (UniqueName: \"kubernetes.io/projected/64c17726-4529-4a16-9d1e-e7e40fa6055a-kube-api-access-9qgtq\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.557810 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.565505 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.582525 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.596115 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.607413 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.611109 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.611144 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.611157 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.611174 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.611187 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
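Both "Failed to update status for pod" errors above have the same shape: the kubelet's status patch is rejected because the pod.network-node-identity.openshift.io admission webhook, served locally on 127.0.0.1:9743, is not accepting connections yet. A minimal stdlib Python sketch (not part of the journal; host and port are copied from the error text) that reproduces the failing dial:

    import socket

    # Probe the admission webhook endpoint named in the errors above. While
    # nothing listens on 127.0.0.1:9743 this fails the same way the kubelet's
    # Post to https://127.0.0.1:9743/pod?timeout=10s does: connection refused.
    try:
        socket.create_connection(("127.0.0.1", 9743), timeout=2).close()
        print("webhook endpoint is accepting connections")
    except OSError as exc:
        print(f"webhook endpoint unreachable: {exc}")

The webhook name suggests it is served by the network-node-identity component, which appears to still be coming up alongside ovnkube-node; once it listens, the kubelet's retried patches should go through.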
Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652664 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-netd\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652725 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgtq\" (UniqueName: \"kubernetes.io/projected/64c17726-4529-4a16-9d1e-e7e40fa6055a-kube-api-access-9qgtq\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652751 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-slash\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652805 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652824 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-bin\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652914 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652943 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-slash\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.652978 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-systemd-units\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653000 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-config\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653028 4852
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-systemd-units\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653043 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-bin\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653069 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-netd\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653141 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-ovn\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653819 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-config\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653870 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-ovn\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.653960 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-systemd\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654044 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-kubelet\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654066 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-node-log\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654083 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-script-lib\") 
pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654132 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-kubelet\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654018 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-systemd\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654138 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-node-log\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654188 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-var-lib-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654215 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654243 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-var-lib-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654255 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-log-socket\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654274 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-ovn-kubernetes\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654282 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-89m87\" 
(UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654291 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-netns\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654311 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-etc-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654315 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-log-socket\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654345 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-env-overrides\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654344 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-ovn-kubernetes\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654368 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-etc-openvswitch\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654372 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovn-node-metrics-cert\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654347 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-netns\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.654907 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-script-lib\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 
11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.655018 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-env-overrides\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.658479 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovn-node-metrics-cert\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.674886 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgtq\" (UniqueName: \"kubernetes.io/projected/64c17726-4529-4a16-9d1e-e7e40fa6055a-kube-api-access-9qgtq\") pod \"ovnkube-node-89m87\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.713648 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.713688 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.713699 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.713714 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.713726 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.778319 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:18 crc kubenswrapper[4852]: W1210 11:52:18.802899 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64c17726_4529_4a16_9d1e_e7e40fa6055a.slice/crio-483383519e1fc6e0525f78e02db44a8c3028228410073467d5ced2664f093db6 WatchSource:0}: Error finding container 483383519e1fc6e0525f78e02db44a8c3028228410073467d5ced2664f093db6: Status 404 returned error can't find the container with id 483383519e1fc6e0525f78e02db44a8c3028228410073467d5ced2664f093db6 Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.805591 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:18 crc kubenswrapper[4852]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Dec 10 11:52:18 crc kubenswrapper[4852]: apiVersion: v1 Dec 10 11:52:18 crc kubenswrapper[4852]: clusters: Dec 10 11:52:18 crc kubenswrapper[4852]: - cluster: Dec 10 11:52:18 crc kubenswrapper[4852]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Dec 10 11:52:18 crc kubenswrapper[4852]: server: https://api-int.crc.testing:6443 Dec 10 11:52:18 crc kubenswrapper[4852]: name: default-cluster Dec 10 11:52:18 crc kubenswrapper[4852]: contexts: Dec 10 11:52:18 crc kubenswrapper[4852]: - context: Dec 10 11:52:18 crc kubenswrapper[4852]: cluster: default-cluster Dec 10 11:52:18 crc kubenswrapper[4852]: namespace: default Dec 10 11:52:18 crc kubenswrapper[4852]: user: default-auth Dec 10 11:52:18 crc kubenswrapper[4852]: name: default-context Dec 10 11:52:18 crc kubenswrapper[4852]: current-context: default-context Dec 10 11:52:18 crc kubenswrapper[4852]: kind: Config Dec 10 11:52:18 crc kubenswrapper[4852]: preferences: {} Dec 10 11:52:18 crc kubenswrapper[4852]: users: Dec 10 11:52:18 crc kubenswrapper[4852]: - name: default-auth Dec 10 11:52:18 crc kubenswrapper[4852]: user: Dec 10 11:52:18 crc kubenswrapper[4852]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Dec 10 11:52:18 crc kubenswrapper[4852]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Dec 10 11:52:18 crc kubenswrapper[4852]: EOF Dec 10 11:52:18 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qgtq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-89m87_openshift-ovn-kubernetes(64c17726-4529-4a16-9d1e-e7e40fa6055a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:18 crc kubenswrapper[4852]: > logger="UnhandledError"
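The CreateContainerConfigError above is hit before the kubecfg-setup init container is even created: the kubelet injects Docker-link-style service environment variables into every container it starts, and until its service informer has listed services at least once it refuses to construct them, failing with exactly this "services have not yet been read at least once, cannot construct envvars" message. A small stdlib sketch (the journal filename is an assumption, e.g. a capture made with journalctl -u kubelet; the regex is written against the escaping seen in these captured lines) that lists which pods are stuck behind this error:

    import re
    import sys

    # List pod/container pairs blocked on the kubelet's service-envvar error.
    # The pattern matches "Error syncing pod" lines as they appear in a saved
    # journal, where quotes inside err="..." are escaped as \".
    pat = re.compile(
        r'failed to \\"StartContainer\\" for \\"([^"\\]+)\\" with '
        r'CreateContainerConfigError.*?pod="([^"]+)"'
    )
    blocked = set()
    with open(sys.argv[1], errors="replace") as journal:
        for line in journal:
            blocked.update(pat.findall(line))
    for container, pod in sorted(blocked):
        print(f"{pod}: {container}")

Against the lines captured in this section it would report kubecfg-setup for openshift-ovn-kubernetes/ovnkube-node-89m87 and egress-router-binary-copy for openshift-multus/multus-additional-cni-plugins-dfbl6.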
logger="UnhandledError" Dec 10 11:52:18 crc kubenswrapper[4852]: E1210 11:52:18.806759 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.816430 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.816481 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.816495 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.816515 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.816529 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.919127 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.919167 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.919179 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.919199 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:18 crc kubenswrapper[4852]: I1210 11:52:18.919211 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:18Z","lastTransitionTime":"2025-12-10T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.021751 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.021781 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.021791 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.021804 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.021814 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.060336 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.073044 4852 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.104411 4852 csr.go:261] certificate signing request csr-mhg85 is approved, waiting to be issued Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.123561 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.123898 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.124014 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.124139 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.124267 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
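The certificate_manager and csr lines above record a kubelet client-certificate rotation: a CSR is created for the kubernetes.io/kube-apiserver-client-kubelet signer, csr-mhg85 is approved, and the issued certificate arrives in the entry that follows. A small sketch computing the turnaround from the klog timestamps (both values copied from this journal, the second from the "is issued" line just below):

    from datetime import datetime

    # klog timestamps copied from the certificate_manager/csr lines in this
    # journal; the delta is the request-to-issue latency for csr-mhg85.
    fmt = "%H:%M:%S.%f"
    rotating = datetime.strptime("11:52:19.060336", fmt)  # "Rotating certificates"
    issued = datetime.strptime("11:52:19.124552", fmt)    # "csr-mhg85 is issued"
    print(f"csr-mhg85 issued {(issued - rotating).total_seconds():.3f}s after rotation began")

On this boot the whole rotation completes in about 64 milliseconds, so certificate signing is not what is holding the node back.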
Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.124552 4852 csr.go:257] certificate signing request csr-mhg85 is issued Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.227069 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.227358 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.227435 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.227513 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.227585 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.267581 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"483383519e1fc6e0525f78e02db44a8c3028228410073467d5ced2664f093db6"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.268503 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerStarted","Data":"e9f35ed333132049c4fcb0fad611d6ac3c84145a1266ae1c4a3ff2b9f2e5ab6b"} Dec 10 11:52:19 crc kubenswrapper[4852]: E1210 11:52:19.270387 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm4wn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dfbl6_openshift-multus(94b935ad-e468-4e03-9bfa-855973944f74): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 10 11:52:19 crc kubenswrapper[4852]: E1210 11:52:19.270981 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 11:52:19 crc kubenswrapper[4852]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Dec 10 11:52:19 crc kubenswrapper[4852]: apiVersion: v1 Dec 10 11:52:19 crc kubenswrapper[4852]: clusters: Dec 10 11:52:19 crc kubenswrapper[4852]: - cluster: Dec 10 11:52:19 crc kubenswrapper[4852]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Dec 10 11:52:19 crc kubenswrapper[4852]: server: https://api-int.crc.testing:6443 Dec 10 11:52:19 crc kubenswrapper[4852]: name: default-cluster Dec 10 11:52:19 crc kubenswrapper[4852]: contexts: Dec 10 11:52:19 crc kubenswrapper[4852]: - context: Dec 10 11:52:19 crc kubenswrapper[4852]: cluster: default-cluster Dec 10 11:52:19 crc kubenswrapper[4852]: namespace: default Dec 10 11:52:19 crc kubenswrapper[4852]: user: default-auth Dec 10 11:52:19 crc kubenswrapper[4852]: name: default-context Dec 10 11:52:19 crc kubenswrapper[4852]: current-context: default-context Dec 10 11:52:19 crc kubenswrapper[4852]: kind: Config Dec 10 11:52:19 crc kubenswrapper[4852]: preferences: {} Dec 10 11:52:19 crc kubenswrapper[4852]: users: Dec 10 11:52:19 crc kubenswrapper[4852]: - name: default-auth Dec 10 11:52:19 crc kubenswrapper[4852]: user: Dec 10 11:52:19 crc kubenswrapper[4852]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Dec 10 11:52:19 crc kubenswrapper[4852]: client-key: 
/etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Dec 10 11:52:19 crc kubenswrapper[4852]: EOF Dec 10 11:52:19 crc kubenswrapper[4852]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qgtq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-89m87_openshift-ovn-kubernetes(64c17726-4529-4a16-9d1e-e7e40fa6055a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 10 11:52:19 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 11:52:19 crc kubenswrapper[4852]: E1210 11:52:19.271497 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" podUID="94b935ad-e468-4e03-9bfa-855973944f74" Dec 10 11:52:19 crc kubenswrapper[4852]: E1210 11:52:19.273135 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.277781 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.290911 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.300252 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.308483 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.314767 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.329426 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.329449 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.329457 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.329471 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.329480 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
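Each "Failed to update status for pod" entry in this run embeds the rejected strategic-merge patch as a quoted string, so every double quote shows up as \\\" in the captured text (one escape level from the quoted patch, another from the journal rendering). Stripping one level turns the payload back into plain JSON, which is much easier to inspect than the raw line. A sketch using an abridged fragment of the node-resolver-qbbd2 patch above (uid and podIP copied from the log; the rest of the patch is omitted):

    import json

    # In the captured line itself each quote inside err="..." is \".
    # One replace() strips that level and yields valid JSON.
    logged = '{\\"metadata\\":{\\"uid\\":\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\"},\\"status\\":{\\"podIP\\":\\"192.168.126.11\\"}}'
    patch = json.loads(logged.replace('\\"', '"'))
    print(patch["metadata"]["uid"], patch["status"]["podIP"])

The same trick applies to the much larger ovnkube-node-89m87 patch that follows.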
Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.329623 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.340636 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.350156 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.358836 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.367875 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.375245 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.385690 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.394792 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.403628 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.411793 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.423192 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.431545 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.432047 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.432135 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.432204 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.432304 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.432376 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.449973 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.460450 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.469998 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.477392 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.486054 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.505096 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.531896 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.534808 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.534863 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.534875 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.534895 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.534908 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.637824 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.637886 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.637897 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.637913 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.637924 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.740264 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.740306 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.740316 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.740332 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.740342 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.825517 4852 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.842390 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.842431 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.842442 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.842458 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.842469 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.944626 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.944669 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.944680 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.944692 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:19 crc kubenswrapper[4852]: I1210 11:52:19.944702 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:19Z","lastTransitionTime":"2025-12-10T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.046774 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.046819 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.046836 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.046853 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.046865 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.125765 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-10 11:47:19 +0000 UTC, rotation deadline is 2026-10-17 03:31:10.290001632 +0000 UTC Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.125828 4852 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7455h38m50.164181691s for next certificate rotation Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.149519 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.149577 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.149591 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.149611 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.149626 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.168944 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.168995 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.169041 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.169143 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.169379 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.169565 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.172190 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.172357 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.172389 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:24.17235461 +0000 UTC m=+30.257879854 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.172437 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.172444 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.172504 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:24.172479003 +0000 UTC m=+30.258004287 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.172586 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.172632 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:24.172622656 +0000 UTC m=+30.258147880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.251992 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.252066 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.252080 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.252112 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.252128 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.273980 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.274025 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274181 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274201 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274215 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274214 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274262 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274276 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274304 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:24.274284903 +0000 UTC m=+30.359810127 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:20 crc kubenswrapper[4852]: E1210 11:52:20.274354 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:24.274320574 +0000 UTC m=+30.359845958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.355092 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.355146 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.355156 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.355174 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.355185 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.457945 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.457997 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.458014 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.458030 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.458043 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.561115 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.561172 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.561186 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.561206 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.561219 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.640648 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5vx5x"] Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.641288 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5vx5x" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.644532 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.644532 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.644825 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.645900 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.653944 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.664525 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.664762 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.664865 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.664955 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.665046 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.677473 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.691473 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.704516 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.716079 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.731117 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.744319 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.757472 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.768555 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.768647 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.768675 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.768704 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.768722 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.771198 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.779026 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnpf\" (UniqueName: \"kubernetes.io/projected/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-kube-api-access-nqnpf\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.779075 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-host\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.779290 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-serviceca\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.784288 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.795251 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.807625 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.821094 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.870698 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.870736 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.870746 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.870763 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.870774 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.880821 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnpf\" (UniqueName: \"kubernetes.io/projected/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-kube-api-access-nqnpf\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.880871 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-host\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.881042 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-host\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.880910 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-serviceca\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.882006 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-serviceca\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.901785 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnpf\" (UniqueName: \"kubernetes.io/projected/defa1e79-6e35-4d5e-b8fc-a3c136208ee9-kube-api-access-nqnpf\") pod \"node-ca-5vx5x\" (UID: \"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\") " pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.952534 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5vx5x"
Dec 10 11:52:20 crc kubenswrapper[4852]: W1210 11:52:20.966280 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefa1e79_6e35_4d5e_b8fc_a3c136208ee9.slice/crio-51ea5d393816dcbff4fed3bbd7fe34d3b252608b1940f8d4cf858be31619dd72 WatchSource:0}: Error finding container 51ea5d393816dcbff4fed3bbd7fe34d3b252608b1940f8d4cf858be31619dd72: Status 404 returned error can't find the container with id 51ea5d393816dcbff4fed3bbd7fe34d3b252608b1940f8d4cf858be31619dd72
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.973054 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.973099 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.973111 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.973130 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:20 crc kubenswrapper[4852]: I1210 11:52:20.973144 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:20Z","lastTransitionTime":"2025-12-10T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.075824 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.076366 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.076377 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.076419 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.076432 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.179690 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.179739 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.179759 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.179776 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.179789 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.275196 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5vx5x" event={"ID":"defa1e79-6e35-4d5e-b8fc-a3c136208ee9","Type":"ContainerStarted","Data":"e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.275263 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5vx5x" event={"ID":"defa1e79-6e35-4d5e-b8fc-a3c136208ee9","Type":"ContainerStarted","Data":"51ea5d393816dcbff4fed3bbd7fe34d3b252608b1940f8d4cf858be31619dd72"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.282920 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.282971 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.282983 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.283001 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.283013 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.287912 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.296791 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.305448 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.317366 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.326412 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.337689 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.350267 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.360489 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.372190 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.385164 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.385769 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.385802 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.385812 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.385826 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.385836 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.397082 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.406549 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.426626 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.488477 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.488516 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:21 crc kubenswrapper[4852]: 
I1210 11:52:21.488529 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.488547 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.488560 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.591715 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.591749 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.591762 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.591780 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.591795 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.695970 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.696013 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.696025 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.696040 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.696055 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.798670 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.798717 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.798729 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.798751 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.798764 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.900928 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.900960 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.900971 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.900986 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:21 crc kubenswrapper[4852]: I1210 11:52:21.900995 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:21Z","lastTransitionTime":"2025-12-10T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.003038 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.003068 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.003077 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.003090 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.003098 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.105143 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.105212 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.105251 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.105270 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.105283 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.169825 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:22 crc kubenswrapper[4852]: E1210 11:52:22.169973 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.170011 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.170072 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:22 crc kubenswrapper[4852]: E1210 11:52:22.170166 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:22 crc kubenswrapper[4852]: E1210 11:52:22.170284 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.207619 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.207670 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.207687 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.207710 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.207726 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.310153 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.310210 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.310223 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.310266 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.310281 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.413501 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.413552 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.413563 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.413582 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.413596 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.516813 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.516871 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.516883 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.516904 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.516917 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.619899 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.619949 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.619964 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.619983 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.619994 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.722781 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.722839 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.722857 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.722889 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.722906 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.826010 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.826049 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.826061 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.826079 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.826090 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.928618 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.928667 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.928679 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.928695 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:22 crc kubenswrapper[4852]: I1210 11:52:22.928708 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:22Z","lastTransitionTime":"2025-12-10T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.031189 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.031226 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.031250 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.031265 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.031275 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.134910 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.134967 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.134986 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.135086 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.135097 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.164819 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.170468 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.178170 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.179508 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.192843 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.207613 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.219435 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.227513 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.237790 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.237825 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.237835 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.237851 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.237861 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.244393 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.255436 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.264595 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.271711 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.281006 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: E1210 11:52:23.289393 4852 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.296985 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.308182 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.318970 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.333621 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.340575 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.340609 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.340619 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.340633 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.340644 4852 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.345305 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.353515 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.362118 4852 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.370877 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.380468 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.387812 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.400494 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.411365 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.421616 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.430877 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.440660 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.442536 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.442578 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.442592 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.442608 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.442620 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.449430 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.462429 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.545947 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.545997 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.546008 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.546023 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.546034 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.649054 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.649325 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.649345 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.649368 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.649387 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.752249 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.752316 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.752334 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.752360 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.752379 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.854643 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.854698 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.854720 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.854752 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.854777 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.957895 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.957938 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.957948 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.957968 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:23 crc kubenswrapper[4852]: I1210 11:52:23.957979 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:23Z","lastTransitionTime":"2025-12-10T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.052118 4852 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.084020 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.084485 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.084932 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.084986 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.085061 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:24Z","lastTransitionTime":"2025-12-10T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.169637 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.170112 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.169633 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.170494 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.169760 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.170983 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.180336 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.189675 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.189973 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.190347 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.190609 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.190934 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:24Z","lastTransitionTime":"2025-12-10T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.195366 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.208647 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.217192 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.217360 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.217395 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.217418 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:32.217391822 +0000 UTC m=+38.302917046 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.217522 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.217580 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:32.217563116 +0000 UTC m=+38.303088410 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.217572 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.217676 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:32.217654779 +0000 UTC m=+38.303180033 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.221220 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.233572 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.245945 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.260729 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.271841 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.282594 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.292658 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.294909 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.294957 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.294967 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.294983 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.294994 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:24Z","lastTransitionTime":"2025-12-10T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.308876 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.318720 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.318783 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.318955 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.318999 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.319011 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.319079 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:32.319063369 +0000 UTC m=+38.404588593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.318969 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.319106 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.319120 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:24 crc kubenswrapper[4852]: E1210 11:52:24.319172 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:32.319155562 +0000 UTC m=+38.404680866 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.322698 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.333954 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.343556 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.397482 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.397533 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.397547 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.397567 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:24 crc kubenswrapper[4852]: I1210 11:52:24.397582 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:24Z","lastTransitionTime":"2025-12-10T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry sequence (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready" with the identical KubeletNotReady / no-CNI-configuration message) recurs roughly every 100 ms, with heartbeat timestamps at 11:52:24.500, 24.603, 24.707, 24.809, 24.912, 25.015, 25.117, 25.220, 25.323, 25.425, 25.528, 25.632, 25.735, 25.839, 25.942, 26.044 and 26.147; the repeated entries are elided here ...]
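The NodeNotReady loop above has a single cause: the container runtime reports NetworkReady=false because no CNI network configuration exists yet in /etc/kubernetes/cni/net.d/ (OVN-Kubernetes has not written its config file at this point in startup). The sketch below is illustrative only, not kubelet or CRI-O code; it assumes the conventional libcni/ocicni rule that any .conf, .conflist, or .json file in the configuration directory counts as a network config, which is what the "Has your network provider started?" message is waiting for.

// cnicheck.go - minimal sketch (assumption: libcni-style extension matching),
// mimicking the readiness test behind "no CNI configuration file in
// /etc/kubernetes/cni/net.d/". The directory path is taken from the log above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // the CNI conf dir named in the log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	found := false
	for _, e := range entries {
		// Any .conf/.conflist/.json file counts as a network configuration.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Printf("CNI config present: %s\n", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		// This is the state the kubelet keeps reporting above:
		// NetworkReady=false, reason NetworkPluginNotReady.
		fmt.Println("no CNI configuration file found; node stays NotReady")
	}
}

Once the network operator writes its config into that directory (for OVN-Kubernetes typically a file named 10-ovn-kubernetes.conf), the same check succeeds and the Ready condition can flip to True.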
Dec 10 11:52:26 crc kubenswrapper[4852]: I1210 11:52:26.169511 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 10 11:52:26 crc kubenswrapper[4852]: I1210 11:52:26.169585 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 10 11:52:26 crc kubenswrapper[4852]: I1210 11:52:26.169614 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 10 11:52:26 crc kubenswrapper[4852]: E1210 11:52:26.169662 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 10 11:52:26 crc kubenswrapper[4852]: E1210 11:52:26.169914 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 10 11:52:26 crc kubenswrapper[4852]: E1210 11:52:26.169814 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... the same five-entry node-status sequence resumes at 11:52:26.250 and recurs at 26.356, 26.460, 26.562, 26.666, 26.769, 26.873, 26.976, 27.080, 27.184 and 27.254; the repeated entries are elided here ...]
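Every failed status patch recorded in this excerpt, and the node-status patch failure that follows, ends with the same root error: Post "https://127.0.0.1:9743/...": dial tcp 127.0.0.1:9743: connect: connection refused. The network-node-identity admission webhook that must approve pod and node status patches is itself not serving yet, so the kubelet's updates are rejected until the network stack comes up. A minimal, self-contained probe of that failure mode (the address comes from the log's error messages; everything else is an illustrative sketch, not OpenShift code):

// webhookprobe.go - attempts a TCP connection to the network-node-identity
// webhook port named in the patch errors above, to show the failure mode.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // webhook endpoint from the log's error messages
	conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
	if err != nil {
		// At this point in the boot this would be expected to print the same
		// "connect: connection refused" the webhook client reports.
		fmt.Printf("webhook unreachable: %v\n", err)
		return
	}
	conn.Close()
	fmt.Println("webhook port is accepting connections")
}

Once ovnkube's identity webhook binds 127.0.0.1:9743, the retried patches seen in the log can go through.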
Dec 10 11:52:27 crc kubenswrapper[4852]: E1210 11:52:27.280442 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ec81da2d-010c-440a-a2c9-f3547047ac06\\\",\\\"systemUUID\\\":\\\"8aceede4-7323-43e8-a979-088fd86df0ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.285853 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.285932 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.285950 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.285971 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.285985 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: E1210 11:52:27.298479 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ec81da2d-010c-440a-a2c9-f3547047ac06\\\",\\\"systemUUID\\\":\\\"8aceede4-7323-43e8-a979-088fd86df0ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.302839 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.302891 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.302905 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.302928 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.302944 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: E1210 11:52:27.312524 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ec81da2d-010c-440a-a2c9-f3547047ac06\\\",\\\"systemUUID\\\":\\\"8aceede4-7323-43e8-a979-088fd86df0ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.324088 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.324140 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.324153 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.324171 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.324185 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: E1210 11:52:27.335073 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ec81da2d-010c-440a-a2c9-f3547047ac06\\\",\\\"systemUUID\\\":\\\"8aceede4-7323-43e8-a979-088fd86df0ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.340006 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.340073 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.340090 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.340577 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.340624 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: E1210 11:52:27.354697 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ec81da2d-010c-440a-a2c9-f3547047ac06\\\",\\\"systemUUID\\\":\\\"8aceede4-7323-43e8-a979-088fd86df0ad\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:27 crc kubenswrapper[4852]: E1210 11:52:27.354878 4852 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.357014 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.357092 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.357111 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.357143 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.357162 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.460596 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.460655 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.460669 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.460692 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.460707 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.564292 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.564349 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.564361 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.564383 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.564398 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.667645 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.667712 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.667725 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.667747 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.667761 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.770573 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.770624 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.770636 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.770656 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.770672 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.873816 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.873872 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.873885 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.873909 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.873924 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.978210 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.978334 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.978350 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.978378 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:27 crc kubenswrapper[4852]: I1210 11:52:27.978395 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:27Z","lastTransitionTime":"2025-12-10T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.081191 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.081268 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.081278 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.081313 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.081329 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.169054 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.169149 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.169152 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:28 crc kubenswrapper[4852]: E1210 11:52:28.169287 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:28 crc kubenswrapper[4852]: E1210 11:52:28.169792 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:28 crc kubenswrapper[4852]: E1210 11:52:28.170024 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.184112 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.184151 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.184161 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.184178 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.184188 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.287961 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.288025 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.288039 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.288062 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.288078 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.391483 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.391530 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.391542 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.391558 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.391570 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.493980 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.494036 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.494050 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.494066 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.494077 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.596952 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.597013 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.597027 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.597043 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.597053 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.699800 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.699855 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.699868 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.699885 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.699906 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.802791 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.802850 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.802875 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.802900 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.802916 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.906694 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.906733 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.906741 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.906755 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:28 crc kubenswrapper[4852]: I1210 11:52:28.906766 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:28Z","lastTransitionTime":"2025-12-10T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.010045 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.010075 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.010282 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.010301 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.010310 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.112772 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.112834 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.112845 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.112862 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.112876 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.216087 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.216175 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.216188 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.216208 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.216220 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.318463 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.318521 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.318535 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.318555 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.318568 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.421269 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.421317 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.421329 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.421346 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.421357 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.523630 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.523673 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.523684 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.523699 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.523709 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.626000 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.626037 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.626060 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.626077 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.626088 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.729120 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.729169 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.729179 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.729195 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.729205 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.831197 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.831259 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.831272 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.831288 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.831300 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.933304 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.933341 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.933349 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.933366 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:29 crc kubenswrapper[4852]: I1210 11:52:29.933383 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:29Z","lastTransitionTime":"2025-12-10T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.035916 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.035960 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.035970 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.035987 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.036000 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.138344 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.138384 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.138393 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.138408 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.138417 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.168848 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:30 crc kubenswrapper[4852]: E1210 11:52:30.168990 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.169013 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.169034 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:30 crc kubenswrapper[4852]: E1210 11:52:30.169166 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:30 crc kubenswrapper[4852]: E1210 11:52:30.169380 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.241192 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.241266 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.241281 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.241300 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.241314 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.344866 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.344913 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.344925 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.344946 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.344958 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.407175 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6"] Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.407705 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.411144 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.411636 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.420319 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.430937 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.440085 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.447369 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.447394 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.447401 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.447415 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.447426 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.452018 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.468339 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.487266 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.490680 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/127d1270-775c-4908-88e1-650a4bd172dd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: 
\"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.490725 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/127d1270-775c-4908-88e1-650a4bd172dd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.490747 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6cmg\" (UniqueName: \"kubernetes.io/projected/127d1270-775c-4908-88e1-650a4bd172dd-kube-api-access-n6cmg\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.490801 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/127d1270-775c-4908-88e1-650a4bd172dd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.497626 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.525005 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.544610 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.549723 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.549762 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.549773 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.549789 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.549800 4852 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.557323 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.569643 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.580675 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.591759 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/127d1270-775c-4908-88e1-650a4bd172dd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.591816 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/127d1270-775c-4908-88e1-650a4bd172dd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.591838 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6cmg\" (UniqueName: \"kubernetes.io/projected/127d1270-775c-4908-88e1-650a4bd172dd-kube-api-access-n6cmg\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.591918 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/127d1270-775c-4908-88e1-650a4bd172dd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.592804 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/127d1270-775c-4908-88e1-650a4bd172dd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.593044 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/127d1270-775c-4908-88e1-650a4bd172dd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.599079 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.603039 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/127d1270-775c-4908-88e1-650a4bd172dd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.609341 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6cmg\" (UniqueName: \"kubernetes.io/projected/127d1270-775c-4908-88e1-650a4bd172dd-kube-api-access-n6cmg\") pod \"ovnkube-control-plane-749d76644c-2mgt6\" (UID: \"127d1270-775c-4908-88e1-650a4bd172dd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.611575 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.622005 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.653252 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.653289 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.653298 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.653314 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.653324 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.720037 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" Dec 10 11:52:30 crc kubenswrapper[4852]: W1210 11:52:30.742149 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127d1270_775c_4908_88e1_650a4bd172dd.slice/crio-7175871554b605e0d904da0227cee711cb36ebfcda11f1a23c95dbd33c87cb51 WatchSource:0}: Error finding container 7175871554b605e0d904da0227cee711cb36ebfcda11f1a23c95dbd33c87cb51: Status 404 returned error can't find the container with id 7175871554b605e0d904da0227cee711cb36ebfcda11f1a23c95dbd33c87cb51 Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.755784 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.755833 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.755845 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.755864 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.755876 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.858759 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.858813 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.858825 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.858845 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.858857 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.962314 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.962368 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.962382 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.962406 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:30 crc kubenswrapper[4852]: I1210 11:52:30.962419 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:30Z","lastTransitionTime":"2025-12-10T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.066225 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.066281 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.066292 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.066311 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.066324 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:31Z","lastTransitionTime":"2025-12-10T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.168209 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.168278 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.168287 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.168304 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.168314 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:31Z","lastTransitionTime":"2025-12-10T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.270653 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.270694 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.270705 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.270721 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.270731 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:31Z","lastTransitionTime":"2025-12-10T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.350760 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.354541 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" event={"ID":"127d1270-775c-4908-88e1-650a4bd172dd","Type":"ContainerStarted","Data":"7175871554b605e0d904da0227cee711cb36ebfcda11f1a23c95dbd33c87cb51"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.367145 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.373189 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.373225 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.373259 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.373281 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.373294 4852 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:31Z","lastTransitionTime":"2025-12-10T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.381605 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.392847 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.403591 4852 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.415561 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.426828 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.440761 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.457644 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.472079 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.799544 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.799595 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.799621 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.799643 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.799655 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:31Z","lastTransitionTime":"2025-12-10T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.817726 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.831368 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.840651 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bjxbn"] Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.841093 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:31 crc kubenswrapper[4852]: E1210 11:52:31.841165 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.843840 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.851835 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.871063 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.884198 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.896076 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.900968 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.901012 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb77\" (UniqueName: \"kubernetes.io/projected/d4917776-2f46-46af-bd13-db5745bfdbf0-kube-api-access-6vb77\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.901878 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.901939 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.901951 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.901965 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.901975 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:31Z","lastTransitionTime":"2025-12-10T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.908027 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.918999 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.930311 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.946489 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.960906 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.977251 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:31 crc kubenswrapper[4852]: I1210 11:52:31.993289 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.004743 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.004911 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb77\" (UniqueName: \"kubernetes.io/projected/d4917776-2f46-46af-bd13-db5745bfdbf0-kube-api-access-6vb77\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.005103 4852 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.005203 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.005208 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs podName:d4917776-2f46-46af-bd13-db5745bfdbf0 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:32.505182337 +0000 UTC m=+38.590707721 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs") pod "network-metrics-daemon-bjxbn" (UID: "d4917776-2f46-46af-bd13-db5745bfdbf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.005266 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.005298 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.005324 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.005337 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.007538 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.022739 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4917776-2f46-46af-bd13-db5745bfdbf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bjxbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.023874 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb77\" (UniqueName: \"kubernetes.io/projected/d4917776-2f46-46af-bd13-db5745bfdbf0-kube-api-access-6vb77\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.048659 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc 
kubenswrapper[4852]: I1210 11:52:32.059803 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.069752 4852 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.082064 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.094510 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.105372 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.107536 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.107581 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.107593 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.107614 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.107629 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.169533 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.169665 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.169715 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.169995 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.170183 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.170354 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.210922 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.211349 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.211385 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.211402 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.211411 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.306959 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.307112 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.307148 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.307192 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:48.30715036 +0000 UTC m=+54.392675584 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.307294 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.307330 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.307373 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:48.307354046 +0000 UTC m=+54.392879450 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.307477 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:48.307454038 +0000 UTC m=+54.392979422 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.314896 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.314940 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.314951 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.314967 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.314979 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.363034 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" event={"ID":"127d1270-775c-4908-88e1-650a4bd172dd","Type":"ContainerStarted","Data":"948bc592f30293609bf7e2b14c7b3aa01ddb2579ead6fc049b35da0d9daab301"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.364451 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"743167c33577d63ff7f9ed9e498e6102b6dd74b26ac4c1e71f992747db7b0d68"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.408027 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.408081 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408201 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408216 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408259 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408287 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408349 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408366 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408307 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-10 11:52:48.408293034 +0000 UTC m=+54.493818258 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.408508 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:48.408460538 +0000 UTC m=+54.493985762 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.417975 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.418025 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.418038 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.418055 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.418067 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.509072 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.509851 4852 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: E1210 11:52:32.509983 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs podName:d4917776-2f46-46af-bd13-db5745bfdbf0 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:33.509958591 +0000 UTC m=+39.595483815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs") pod "network-metrics-daemon-bjxbn" (UID: "d4917776-2f46-46af-bd13-db5745bfdbf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.520726 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.520766 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.520775 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.520791 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.520801 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.623435 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.623482 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.623494 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.623510 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.623523 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.726394 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.726460 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.726476 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.726500 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.726516 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.829656 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.829723 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.829738 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.829766 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.829781 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.932827 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.932890 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.932913 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.932935 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:32 crc kubenswrapper[4852]: I1210 11:52:32.932952 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:32Z","lastTransitionTime":"2025-12-10T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.035858 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.035950 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.035966 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.035987 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.036000 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.140593 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.140646 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.140658 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.140680 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.140694 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.169495 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn"
Dec 10 11:52:33 crc kubenswrapper[4852]: E1210 11:52:33.170299 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.243302 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.243347 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.243356 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.243371 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.243380 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.346040 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.346081 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.346095 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.346111 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.346121 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.368276 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4f72be959c9139862899929e8ff8661a90b71cee828d23e2d6f8fa5c8fd8df99"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.369391 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qbbd2" event={"ID":"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd","Type":"ContainerStarted","Data":"5f6a77195e2ca6a89769317cd943cc823732a0c4b76ff29c22356bd1690c83a2"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.370183 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzcx9" event={"ID":"7d01ef2d-58af-42c3-b716-9020614e2a09","Type":"ContainerStarted","Data":"181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.373850 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"64c51de8669ad2ce08355263b0a7ce8961e927cd250db9be695161f42255e538"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.385098 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.396510 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.410364 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.424469 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.435868 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4917776-2f46-46af-bd13-db5745bfdbf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bjxbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.447810 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.449141 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.449203 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.449216 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.449252 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.449265 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.468248 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.481010 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.497127 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.509861 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.521205 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn"
Dec 10 11:52:33 crc kubenswrapper[4852]: E1210 11:52:33.521422 4852 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 10 11:52:33 crc kubenswrapper[4852]: E1210 11:52:33.521480 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs podName:d4917776-2f46-46af-bd13-db5745bfdbf0 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:35.521465942 +0000 UTC m=+41.606991166 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs") pod "network-metrics-daemon-bjxbn" (UID: "d4917776-2f46-46af-bd13-db5745bfdbf0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.522991 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.537568 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.562659 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.564675 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.564704 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.564715 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.564731 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.564742 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.583389 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z"
Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.596584 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.612652 4852 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.628324 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.645982 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.661779 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.666742 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.666791 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.666807 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.666825 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.666840 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.675761 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4917776-2f46-46af-bd13-db5745bfdbf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bjxbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.691457 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.705263 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.715926 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.739668 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.751648 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.760729 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.769116 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.769152 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.769164 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.769181 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.769192 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.782609 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.802861 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.817882 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743167c33577d63ff7f9ed9e498e6102b6dd74b26ac4c1e71f992747db7b0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.833979 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.851520 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.865145 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:33Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.871645 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.871716 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.871729 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.871750 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.871780 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.974411 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.974449 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.974459 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.974473 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:33 crc kubenswrapper[4852]: I1210 11:52:33.974484 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:33Z","lastTransitionTime":"2025-12-10T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.077401 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.077437 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.077448 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.077469 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.077480 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.169457 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:34 crc kubenswrapper[4852]: E1210 11:52:34.169618 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.169390 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:34 crc kubenswrapper[4852]: E1210 11:52:34.170495 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.170640 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:34 crc kubenswrapper[4852]: E1210 11:52:34.170723 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.179518 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.179553 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.179561 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.179575 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.179587 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.188059 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.204736 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.221922 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.236404 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.249046 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4917776-2f46-46af-bd13-db5745bfdbf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bjxbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.271275 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.283733 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.283799 4852 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.283815 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.283843 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.283859 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.292658 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.310864 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.331926 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb9
71e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.346462 4852 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.359014 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.374197 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.386986 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.387035 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.387047 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.387065 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.387079 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.390736 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.404752 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743167c33577d63ff7f9ed9e498e6102b6dd74b26ac4c1e71f992747db7b0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc 
kubenswrapper[4852]: I1210 11:52:34.419417 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.437729 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.451675 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743167c33577d63ff7f9ed9e498e6102b6dd74b26ac4c1e71f992747db7b0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.465363 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.487567 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94b935ad-e468-4e03-9bfa-855973944f74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cm4wn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dfbl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.490196 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.490273 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.490289 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.490316 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.490330 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.515346 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.533100 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.549022 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.570866 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mzcx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d01ef2d-58af-42c3-b716-9020614e2a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrh7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mzcx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.581417 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4917776-2f46-46af-bd13-db5745bfdbf0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vb77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bjxbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.593296 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.593325 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.593335 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.593349 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.593358 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.596647 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.614335 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c51de8669ad2ce08355263b0a7ce8961e927cd250db9be695161f42255e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.628817 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qbbd2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa2206a-32c7-4bcc-8899-9bb9742ba9fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6a77195e2ca6a89769317cd943cc823732a0c4b76ff29c22356bd1690c83a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d56z4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qbbd2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.651764 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64c17726-4529-4a16-9d1e-e7e40fa6055a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name
\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qgtq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-89m87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.669428 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127d1270-775c-4908-88e1-650a4bd172dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6cmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2mgt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.680609 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vx5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"defa1e79-6e35-4d5e-b8fc-a3c136208ee9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84dc258f0f8d1f10ae6402bbdfcf0e335152c11f257aa0393b66cd0506a8e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqnpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vx5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.696945 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.697023 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.697035 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.697061 4852 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.697076 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.700359 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1400578-050a-4290-92f9-5db657016b2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f78
14a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-10T11:52:15Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1210 11:52:09.812071 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1210 11:52:09.813940 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1241641919/tls.crt::/tmp/serving-cert-1241641919/tls.key\\\\\\\"\\\\nI1210 11:52:15.612375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1210 11:52:15.615277 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1210 11:52:15.615302 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1210 11:52:15.615328 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1210 11:52:15.615336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1210 11:52:15.620457 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1210 11:52:15.620510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620516 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1210 11:52:15.620522 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1210 11:52:15.620527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1210 11:52:15.620531 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1210 11:52:15.620535 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1210 11:52:15.620896 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1210 11:52:15.624033 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-10T11:51:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.717308 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:34Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.799721 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.799975 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.800033 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.800100 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.800175 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.904065 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.904146 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.904167 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.904200 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:34 crc kubenswrapper[4852]: I1210 11:52:34.904218 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:34Z","lastTransitionTime":"2025-12-10T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.007499 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.007563 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.007578 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.007599 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.007610 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.109880 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.110275 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.110288 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.110304 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.110313 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.169755 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:35 crc kubenswrapper[4852]: E1210 11:52:35.169950 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.212377 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.212411 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.212422 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.212440 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.212452 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.314863 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.314898 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.314906 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.314922 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.314931 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.380277 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"df5387583f66f93b26a76954748f69c02df08bb9c349c9c9465ec2fb73fa4fd0"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.381544 4852 generic.go:334] "Generic (PLEG): container finished" podID="94b935ad-e468-4e03-9bfa-855973944f74" containerID="5ae782355ffdea4349276426ae371cbe4403c1aa108e2e831559e6890d00f2d0" exitCode=0 Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.381655 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerDied","Data":"5ae782355ffdea4349276426ae371cbe4403c1aa108e2e831559e6890d00f2d0"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.383386 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c0dbc7d8f2d6dda3d0964e1b3b2eb5bfb6fd5f7254be4fd341a4b43e87c06011"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.388589 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="e5d7d4620ee981f00c1e19c2f2c5b22985fd554dfb2b8c689401b1fefc33ce13" exitCode=0 Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.388662 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"e5d7d4620ee981f00c1e19c2f2c5b22985fd554dfb2b8c689401b1fefc33ce13"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.391052 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" event={"ID":"127d1270-775c-4908-88e1-650a4bd172dd","Type":"ContainerStarted","Data":"0c067ca30b0358ad038ca8faf657aab4b950cceb0f2b5506cdf235fa9b555a71"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.400872 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"151508f1-43dd-44fd-80fe-83a65a296517\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d0bdcc57efa54db9728a49998f7bd563e875a18a60b60075656c8ad684ce4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b7655552697d0417a1ef7fb2cfdcf898d796f698b4133a2fd35fa4a8c55d1d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c50bd8e351b7a763e803601753b535e5c485d442f7ba8c0d37670098031209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:51:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:35Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.416359 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:35Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.418114 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.418161 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.418173 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.418188 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.418197 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.438200 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://743167c33577d63ff7f9ed9e498e6102b6dd74b26ac4c1e71f992747db7b0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-10T11:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:35Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.451027 4852 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06184023-d738-4d23-ae7e-bc0dde135fa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-10T11:52:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cscwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-10T11:52:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-thqgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-10T11:52:35Z is after 2025-08-24T17:21:41Z" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.520730 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.520763 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.520774 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.520790 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.520799 4852 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.547125 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:35 crc kubenswrapper[4852]: E1210 11:52:35.547326 4852 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:35 crc kubenswrapper[4852]: E1210 11:52:35.547383 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs podName:d4917776-2f46-46af-bd13-db5745bfdbf0 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:39.547365465 +0000 UTC m=+45.632890699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs") pod "network-metrics-daemon-bjxbn" (UID: "d4917776-2f46-46af-bd13-db5745bfdbf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.623151 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.623202 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.623214 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.623254 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.623269 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.701299 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mzcx9" podStartSLOduration=18.701275927 podStartE2EDuration="18.701275927s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:35.680880825 +0000 UTC m=+41.766406059" watchObservedRunningTime="2025-12-10 11:52:35.701275927 +0000 UTC m=+41.786801161" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.726493 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.726549 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.726563 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.726585 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.726599 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.762628 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qbbd2" podStartSLOduration=18.762611253 podStartE2EDuration="18.762611253s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:35.762594313 +0000 UTC m=+41.848119557" watchObservedRunningTime="2025-12-10 11:52:35.762611253 +0000 UTC m=+41.848136477" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.783169 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.783153048 podStartE2EDuration="19.783153048s" podCreationTimestamp="2025-12-10 11:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:35.780222513 +0000 UTC m=+41.865747767" watchObservedRunningTime="2025-12-10 11:52:35.783153048 +0000 UTC m=+41.868678272" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.809633 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5vx5x" podStartSLOduration=18.809610914 podStartE2EDuration="18.809610914s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:35.809127882 +0000 UTC m=+41.894653106" watchObservedRunningTime="2025-12-10 11:52:35.809610914 +0000 UTC m=+41.895136148" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.825154 4852 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=12.825138701 podStartE2EDuration="12.825138701s" podCreationTimestamp="2025-12-10 11:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:35.824867204 +0000 UTC m=+41.910392428" watchObservedRunningTime="2025-12-10 11:52:35.825138701 +0000 UTC m=+41.910663925" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.828624 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.828680 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.828691 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.828934 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.828980 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.932133 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.932171 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.932183 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.932201 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.932213 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:35Z","lastTransitionTime":"2025-12-10T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:35 crc kubenswrapper[4852]: I1210 11:52:35.933695 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2mgt6" podStartSLOduration=18.933684264 podStartE2EDuration="18.933684264s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:35.93354032 +0000 UTC m=+42.019065564" watchObservedRunningTime="2025-12-10 11:52:35.933684264 +0000 UTC m=+42.019209498" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.034284 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.034334 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.034349 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.034367 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.034380 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.137635 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.137684 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.137697 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.137718 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.137730 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.169670 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.169735 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.169960 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:36 crc kubenswrapper[4852]: E1210 11:52:36.169991 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:36 crc kubenswrapper[4852]: E1210 11:52:36.169967 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:36 crc kubenswrapper[4852]: E1210 11:52:36.170135 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.241679 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.241989 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.241998 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.242014 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.242024 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.345416 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.345475 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.345487 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.345506 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.345517 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.397931 4852 generic.go:334] "Generic (PLEG): container finished" podID="94b935ad-e468-4e03-9bfa-855973944f74" containerID="88386318507e950b57d69d35b96eee1b4a6b5aa1f49c275bedd0e89c2cf57b02" exitCode=0 Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.397997 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerDied","Data":"88386318507e950b57d69d35b96eee1b4a6b5aa1f49c275bedd0e89c2cf57b02"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.419536 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"e2e4371fec6226d65683a3a8c8b2f400e69e6f0881a7b801f3c06d62256332df"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.419588 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"0e72f2311dc0b72b226ef1f9b878c8cdd81b928d1157cad951ad4bbe6117f79f"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.425851 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"da09526f01cd76ec3d5755bb18981b41f79227cc49a4533029b6377276b308d6"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.443915 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podStartSLOduration=19.443883357 podStartE2EDuration="19.443883357s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:36.44319997 +0000 UTC m=+42.528725194" watchObservedRunningTime="2025-12-10 11:52:36.443883357 +0000 UTC m=+42.529408611" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.448319 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 
11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.448399 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.448413 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.448432 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.448444 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.556866 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.556943 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.556958 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.556984 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.557004 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.660578 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.660618 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.660627 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.660642 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.660654 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.765706 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.766115 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.766129 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.766148 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.766160 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.869771 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.869851 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.869865 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.869886 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.869898 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.973508 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.973557 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.973570 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.973596 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:36 crc kubenswrapper[4852]: I1210 11:52:36.973609 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:36Z","lastTransitionTime":"2025-12-10T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.076684 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.076710 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.076717 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.076730 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.076740 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.169663 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:37 crc kubenswrapper[4852]: E1210 11:52:37.169809 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.180533 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.180672 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.180685 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.180701 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.180714 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.284936 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.284987 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.284999 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.285025 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.285039 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.390174 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.390252 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.390267 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.390295 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.390313 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.432626 4852 generic.go:334] "Generic (PLEG): container finished" podID="94b935ad-e468-4e03-9bfa-855973944f74" containerID="4b48507ccc401ac32927271ed454b84639744ff7b29087d6e29e20785ba67125" exitCode=0 Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.432707 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerDied","Data":"4b48507ccc401ac32927271ed454b84639744ff7b29087d6e29e20785ba67125"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.438001 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"7628905c2e633e97f2fe958e4211f5b02890b8fe5105b17966eb92658c9b9012"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.438067 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"6901027836a6617e3b89c30b0d07101b2e21384249f3b74c9f39334e3aeab0e6"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.438088 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"a1028602c7fa4ce561b0d24d016dd49368a7bc7a4d7f6c4cd21254651b84eb77"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.438106 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"986c5d5945116451836d213d41a0bd862b4c52bfa735a9d983424a109980a3c4"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.495061 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.495133 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.495585 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.495618 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.495633 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.598337 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.598377 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.598390 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.598404 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.598414 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.702592 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.702650 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.702663 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.702682 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.702694 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.735619 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.735675 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.735689 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.735708 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.735721 4852 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-10T11:52:37Z","lastTransitionTime":"2025-12-10T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.784014 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g"] Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.784455 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.786273 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.786649 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.786878 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.787291 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.974506 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/581112b1-1950-4ec4-9216-a7ab48de68fb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.974585 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581112b1-1950-4ec4-9216-a7ab48de68fb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.974616 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/581112b1-1950-4ec4-9216-a7ab48de68fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.974697 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/581112b1-1950-4ec4-9216-a7ab48de68fb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:37 crc kubenswrapper[4852]: I1210 11:52:37.974736 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581112b1-1950-4ec4-9216-a7ab48de68fb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.075078 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/581112b1-1950-4ec4-9216-a7ab48de68fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.075125 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/581112b1-1950-4ec4-9216-a7ab48de68fb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.075160 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581112b1-1950-4ec4-9216-a7ab48de68fb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.075201 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/581112b1-1950-4ec4-9216-a7ab48de68fb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.075891 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581112b1-1950-4ec4-9216-a7ab48de68fb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.076188 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581112b1-1950-4ec4-9216-a7ab48de68fb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.076519 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/581112b1-1950-4ec4-9216-a7ab48de68fb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.076561 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/581112b1-1950-4ec4-9216-a7ab48de68fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.081635 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/581112b1-1950-4ec4-9216-a7ab48de68fb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.093435 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/581112b1-1950-4ec4-9216-a7ab48de68fb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b5z5g\" (UID: \"581112b1-1950-4ec4-9216-a7ab48de68fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.102010 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" Dec 10 11:52:38 crc kubenswrapper[4852]: W1210 11:52:38.119617 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod581112b1_1950_4ec4_9216_a7ab48de68fb.slice/crio-b9f0db9be60e2fad353e34123488090ba7c416c24a31b447447a9c6f16ff9e74 WatchSource:0}: Error finding container b9f0db9be60e2fad353e34123488090ba7c416c24a31b447447a9c6f16ff9e74: Status 404 returned error can't find the container with id b9f0db9be60e2fad353e34123488090ba7c416c24a31b447447a9c6f16ff9e74 Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.169019 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:38 crc kubenswrapper[4852]: E1210 11:52:38.169188 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.169049 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:38 crc kubenswrapper[4852]: E1210 11:52:38.169304 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.169041 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:38 crc kubenswrapper[4852]: E1210 11:52:38.169375 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.448077 4852 generic.go:334] "Generic (PLEG): container finished" podID="94b935ad-e468-4e03-9bfa-855973944f74" containerID="b3ad28d74bf572ad04ff170a28bf593f6088bbd73862034161067efa1135ade0" exitCode=0 Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.448162 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerDied","Data":"b3ad28d74bf572ad04ff170a28bf593f6088bbd73862034161067efa1135ade0"} Dec 10 11:52:38 crc kubenswrapper[4852]: I1210 11:52:38.449956 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" event={"ID":"581112b1-1950-4ec4-9216-a7ab48de68fb","Type":"ContainerStarted","Data":"b9f0db9be60e2fad353e34123488090ba7c416c24a31b447447a9c6f16ff9e74"} Dec 10 11:52:39 crc kubenswrapper[4852]: I1210 11:52:39.169273 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:39 crc kubenswrapper[4852]: E1210 11:52:39.169972 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:39 crc kubenswrapper[4852]: I1210 11:52:39.596036 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:39 crc kubenswrapper[4852]: E1210 11:52:39.596348 4852 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:39 crc kubenswrapper[4852]: E1210 11:52:39.597212 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs podName:d4917776-2f46-46af-bd13-db5745bfdbf0 nodeName:}" failed. No retries permitted until 2025-12-10 11:52:47.5971778 +0000 UTC m=+53.682703024 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs") pod "network-metrics-daemon-bjxbn" (UID: "d4917776-2f46-46af-bd13-db5745bfdbf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.170996 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.171137 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:40 crc kubenswrapper[4852]: E1210 11:52:40.171250 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:40 crc kubenswrapper[4852]: E1210 11:52:40.171379 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.171524 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:40 crc kubenswrapper[4852]: E1210 11:52:40.171615 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.463154 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"56bc33d68cf0b28a9bda8ec77881ce204966237302fdb28f2d71601eccd79938"} Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.467303 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerStarted","Data":"13c987a7a3e8b681774f4685af14cb2c59ec7c4de89afe9d959a08c9935c61bc"} Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.469074 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" event={"ID":"581112b1-1950-4ec4-9216-a7ab48de68fb","Type":"ContainerStarted","Data":"8e7f4c29d12893be832f83fc7bd14b20a9c2fdf2b15186f2221d8150cecf3132"} Dec 10 11:52:40 crc kubenswrapper[4852]: I1210 11:52:40.507746 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b5z5g" podStartSLOduration=23.507713661 podStartE2EDuration="23.507713661s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:40.50727462 +0000 UTC m=+46.592799864" watchObservedRunningTime="2025-12-10 11:52:40.507713661 +0000 UTC m=+46.593238885" Dec 10 11:52:41 crc kubenswrapper[4852]: I1210 11:52:41.169429 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:41 crc kubenswrapper[4852]: E1210 11:52:41.169594 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:41 crc kubenswrapper[4852]: I1210 11:52:41.478553 4852 generic.go:334] "Generic (PLEG): container finished" podID="94b935ad-e468-4e03-9bfa-855973944f74" containerID="13c987a7a3e8b681774f4685af14cb2c59ec7c4de89afe9d959a08c9935c61bc" exitCode=0 Dec 10 11:52:41 crc kubenswrapper[4852]: I1210 11:52:41.478641 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerDied","Data":"13c987a7a3e8b681774f4685af14cb2c59ec7c4de89afe9d959a08c9935c61bc"} Dec 10 11:52:42 crc kubenswrapper[4852]: I1210 11:52:42.169091 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:42 crc kubenswrapper[4852]: I1210 11:52:42.169256 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:42 crc kubenswrapper[4852]: I1210 11:52:42.169402 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:42 crc kubenswrapper[4852]: E1210 11:52:42.169411 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:42 crc kubenswrapper[4852]: E1210 11:52:42.169549 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:42 crc kubenswrapper[4852]: E1210 11:52:42.169690 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:43 crc kubenswrapper[4852]: I1210 11:52:43.169343 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:43 crc kubenswrapper[4852]: E1210 11:52:43.170010 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:43 crc kubenswrapper[4852]: I1210 11:52:43.489101 4852 generic.go:334] "Generic (PLEG): container finished" podID="94b935ad-e468-4e03-9bfa-855973944f74" containerID="c1c04b7cec142318ff2b022569f921598efb5f21acd32fc9f79814b0ec9bbb15" exitCode=0 Dec 10 11:52:43 crc kubenswrapper[4852]: I1210 11:52:43.489185 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerDied","Data":"c1c04b7cec142318ff2b022569f921598efb5f21acd32fc9f79814b0ec9bbb15"} Dec 10 11:52:43 crc kubenswrapper[4852]: I1210 11:52:43.496979 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerStarted","Data":"82f8dd56768dbe144d4e28db2d0938df758f701faadc338df75e9fb7c3a13f43"} Dec 10 11:52:43 crc kubenswrapper[4852]: I1210 11:52:43.560300 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podStartSLOduration=26.56028027 podStartE2EDuration="26.56028027s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:43.559421158 +0000 UTC m=+49.644946402" watchObservedRunningTime="2025-12-10 11:52:43.56028027 +0000 UTC m=+49.645805514" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.169500 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.169504 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.169569 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:44 crc kubenswrapper[4852]: E1210 11:52:44.172416 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:44 crc kubenswrapper[4852]: E1210 11:52:44.172501 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:44 crc kubenswrapper[4852]: E1210 11:52:44.172581 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.500655 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.500701 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.500713 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.861219 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:44 crc kubenswrapper[4852]: I1210 11:52:44.867169 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 11:52:45 crc kubenswrapper[4852]: I1210 11:52:45.169370 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:45 crc kubenswrapper[4852]: E1210 11:52:45.169581 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:46 crc kubenswrapper[4852]: I1210 11:52:46.168924 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:46 crc kubenswrapper[4852]: I1210 11:52:46.169039 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:46 crc kubenswrapper[4852]: E1210 11:52:46.169048 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:46 crc kubenswrapper[4852]: I1210 11:52:46.168920 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:46 crc kubenswrapper[4852]: E1210 11:52:46.169227 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:46 crc kubenswrapper[4852]: E1210 11:52:46.169295 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:46 crc kubenswrapper[4852]: I1210 11:52:46.510182 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" event={"ID":"94b935ad-e468-4e03-9bfa-855973944f74","Type":"ContainerStarted","Data":"f1725fe0c8e19830e42b62c0356f794b367a8bb38824ab9213ab3e22f3645ff2"} Dec 10 11:52:47 crc kubenswrapper[4852]: I1210 11:52:47.169739 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:47 crc kubenswrapper[4852]: E1210 11:52:47.169931 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:47 crc kubenswrapper[4852]: I1210 11:52:47.614017 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:47 crc kubenswrapper[4852]: E1210 11:52:47.614180 4852 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:47 crc kubenswrapper[4852]: E1210 11:52:47.614434 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs podName:d4917776-2f46-46af-bd13-db5745bfdbf0 nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.614414936 +0000 UTC m=+69.699940160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs") pod "network-metrics-daemon-bjxbn" (UID: "d4917776-2f46-46af-bd13-db5745bfdbf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.169109 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.169266 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.169511 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.169515 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.169626 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.169693 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.320674 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.320847 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.320875 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:20.320840422 +0000 UTC m=+86.406365686 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.320944 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.320974 4852 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.321078 4852 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.321081 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:53:20.321050847 +0000 UTC m=+86.406576121 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.321186 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-10 11:53:20.32116682 +0000 UTC m=+86.406692054 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.422294 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.422386 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422495 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422525 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422543 4852 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422495 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422591 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-10 11:53:20.4225762 +0000 UTC m=+86.508101434 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422672 4852 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422715 4852 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:48 crc kubenswrapper[4852]: E1210 11:52:48.422849 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-10 11:53:20.422813346 +0000 UTC m=+86.508338610 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.816385 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 10 11:52:48 crc kubenswrapper[4852]: I1210 11:52:48.826293 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 10 11:52:49 crc kubenswrapper[4852]: I1210 11:52:49.169739 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:52:49 crc kubenswrapper[4852]: E1210 11:52:49.169986 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0" Dec 10 11:52:50 crc kubenswrapper[4852]: I1210 11:52:50.168882 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:52:50 crc kubenswrapper[4852]: I1210 11:52:50.168907 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:52:50 crc kubenswrapper[4852]: I1210 11:52:50.169008 4852 util.go:30] "No sandbox for pod can be found. 
Dec 10 11:52:50 crc kubenswrapper[4852]: I1210 11:52:50.169008 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 10 11:52:50 crc kubenswrapper[4852]: E1210 11:52:50.169002 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 10 11:52:50 crc kubenswrapper[4852]: E1210 11:52:50.169198 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 10 11:52:50 crc kubenswrapper[4852]: E1210 11:52:50.169296 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 10 11:52:50 crc kubenswrapper[4852]: I1210 11:52:50.388526 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bjxbn"]
Dec 10 11:52:50 crc kubenswrapper[4852]: I1210 11:52:50.388661 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn"
Dec 10 11:52:50 crc kubenswrapper[4852]: E1210 11:52:50.388755 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.168800 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.168845 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.168943 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 10 11:52:52 crc kubenswrapper[4852]: E1210 11:52:52.168958 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.169033 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn"
Dec 10 11:52:52 crc kubenswrapper[4852]: E1210 11:52:52.169178 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 10 11:52:52 crc kubenswrapper[4852]: E1210 11:52:52.169338 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 10 11:52:52 crc kubenswrapper[4852]: E1210 11:52:52.169490 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.542999 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-89m87"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.551767 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dfbl6" podStartSLOduration=35.551746543 podStartE2EDuration="35.551746543s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:52.54970237 +0000 UTC m=+58.635227604" watchObservedRunningTime="2025-12-10 11:52:52.551746543 +0000 UTC m=+58.637271767"
Dec 10 11:52:52 crc kubenswrapper[4852]: I1210 11:52:52.606763 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=4.606741557 podStartE2EDuration="4.606741557s" podCreationTimestamp="2025-12-10 11:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:52.567323081 +0000 UTC m=+58.652848325" watchObservedRunningTime="2025-12-10 11:52:52.606741557 +0000 UTC m=+58.692266791"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.168737 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.168785 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.168855 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.168920 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 10 11:52:54 crc kubenswrapper[4852]: E1210 11:52:54.169897 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 10 11:52:54 crc kubenswrapper[4852]: E1210 11:52:54.170065 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 10 11:52:54 crc kubenswrapper[4852]: E1210 11:52:54.170156 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 10 11:52:54 crc kubenswrapper[4852]: E1210 11:52:54.170224 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjxbn" podUID="d4917776-2f46-46af-bd13-db5745bfdbf0"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.544730 4852 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.545001 4852 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.582168 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7l2fq"]
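Every "network is not ready" entry above has the same root cause: the container runtime finds no network configuration in /etc/kubernetes/cni/net.d/ until the OVN-Kubernetes pods write one, at which point the node flips to NodeReady as seen in the last entries. A standalone sketch of the directory check (not CRI-O's actual config loader) that reproduces what the runtime is reporting:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the log message; CNI config files conventionally
	// end in .conf, .conflist, or .json.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read", dir, "->", err)
		return
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// The state the kubelet reports as NetworkPluginNotReady.
		fmt.Println("no CNI configuration file in", dir)
	}
}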
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.582790 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.585805 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.586350 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.586760 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.587129 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.587192 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.597473 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.601404 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.622626 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.624127 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.624740 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ndbzv"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.625097 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ndbzv"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.625619 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.625770 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.626035 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.626631 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.626728 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627075 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627170 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627646 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627664 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627765 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627886 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627911 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.627950 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628156 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628215 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628376 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628514 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628638 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628742 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628961 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.628954 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-c82cd"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.629761 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wzdw"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.629884 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.630213 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.630419 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.631040 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.631724 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.632841 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.632979 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.633502 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.633624 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.633729 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.634605 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.634842 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.635061 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r7qp9"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.639533 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.641412 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.641993 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.642353 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.642541 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.642818 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.643049 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.643195 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.643270 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.643434 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.643618 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.643993 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.644162 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.644314 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.644473 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.644989 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.646725 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-srk2d"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.646794 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.647428 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.647905 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vlpmq"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.648118 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.648166 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.648435 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.648905 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vlpmq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.649019 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.649380 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.650940 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.653111 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.653296 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.653879 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.673603 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.681032 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.681535 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.681668 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.681800 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.681559 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.681979 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682015 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682036 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682184 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682322 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2dss"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682393 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682558 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682588 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682850 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.682926 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.683033 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.683047 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.683036 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.684440 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.684556 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.684701 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.684733 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.684836 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.684917 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.686453 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.691442 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.691734 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.691818 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.692286 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.693896 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.694079 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.695094 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.695211 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.695328 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.695821 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.695917 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696012 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696095 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696182 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696219 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696318 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696452 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.696714 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c8lcn"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.697222 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn"
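The reflector.go:368 "Caches populated" storm above is the kubelet bringing up its per-namespace ConfigMap and Secret caches as the node admits pods; once an object is in a synced cache it counts as "registered" and the earlier projected-volume errors stop. The kubelet wires these caches internally, but the same pattern can be sketched with the standard client-go informer machinery (the kubeconfig path below is illustrative):

package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Illustrative kubeconfig path; the kubelet uses its own credentials.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	factory := informers.NewSharedInformerFactory(client, 10*time.Minute)
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	// The equivalent of waiting for "Caches populated for *v1.ConfigMap ...".
	if !cache.WaitForCacheSync(stop, cmInformer.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("ConfigMap cache populated; objects are now registered")
}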
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.697569 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698414 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698742 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-image-import-ca\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698768 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698801 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-serving-cert\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698818 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c92825-dfcf-4030-8fa7-4326fc350f10-config\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698838 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm4n\" (UniqueName: \"kubernetes.io/projected/736a1895-9f79-4788-9f63-5b9b3406540d-kube-api-access-7mm4n\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698859 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-node-pullsecrets\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698876 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-oauth-serving-cert\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698893 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng888\" (UniqueName: \"kubernetes.io/projected/eaf52478-5cc3-48c5-9f24-fc1ad41a3601-kube-api-access-ng888\") pod \"downloads-7954f5f757-ndbzv\" (UID: \"eaf52478-5cc3-48c5-9f24-fc1ad41a3601\") " pod="openshift-console/downloads-7954f5f757-ndbzv"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698911 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsvg\" (UniqueName: \"kubernetes.io/projected/0523d611-7b4c-4293-b657-e076ee51aed2-kube-api-access-dlsvg\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698926 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0523d611-7b4c-4293-b657-e076ee51aed2-auth-proxy-config\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698946 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwb9r\" (UniqueName: \"kubernetes.io/projected/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-kube-api-access-xwb9r\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698965 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-config\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698982 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c9c92825-dfcf-4030-8fa7-4326fc350f10-images\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699000 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-service-ca\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699026 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhl9l\" (UniqueName: \"kubernetes.io/projected/538daf76-3827-4747-bffb-1106c125238c-kube-api-access-nhl9l\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699045 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-audit-dir\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699061 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699081 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-etcd-client\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699100 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-audit\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699115 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-trusted-ca-bundle\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699137 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-config\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699154 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-console-config\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699171 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbkgj\" (UniqueName: \"kubernetes.io/projected/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-kube-api-access-xbkgj\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699189 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s2lh\" (UniqueName: \"kubernetes.io/projected/18ca10bb-9e99-4051-a8d6-197657d74d3f-kube-api-access-4s2lh\") pod \"cluster-samples-operator-665b6dd947-ldgnq\" (UID: \"18ca10bb-9e99-4051-a8d6-197657d74d3f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699207 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76310b37-2f80-4da3-8b7e-8dde4ce8117c-serving-cert\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699225 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0523d611-7b4c-4293-b657-e076ee51aed2-config\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699260 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538daf76-3827-4747-bffb-1106c125238c-serving-cert\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699277 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-oauth-config\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.699293 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-serving-cert\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700142 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6v5\" (UniqueName: \"kubernetes.io/projected/76310b37-2f80-4da3-8b7e-8dde4ce8117c-kube-api-access-lw6v5\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700182 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18ca10bb-9e99-4051-a8d6-197657d74d3f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldgnq\" (UID: \"18ca10bb-9e99-4051-a8d6-197657d74d3f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700200 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clq4\" (UniqueName: \"kubernetes.io/projected/c9c92825-dfcf-4030-8fa7-4326fc350f10-kube-api-access-6clq4\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700225 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-encryption-config\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700255 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-config\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700273 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-service-ca-bundle\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.700291 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-serving-cert\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.714212 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-client-ca\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.714276 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c92825-dfcf-4030-8fa7-4326fc350f10-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.714333 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-etcd-serving-ca\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.714357 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0523d611-7b4c-4293-b657-e076ee51aed2-machine-approver-tls\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.714379 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.703006 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.698949 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.708799 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.715221 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.715360 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.715423 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.716907 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.721473 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.724202 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.725541 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.728142 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.728440 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.729261 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.729395 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.729848 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.730058 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.730399 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.733737 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.734922 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.736737 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.738761 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.750870 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.755800 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.767057 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.767370 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.774749 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.778691 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.780254 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.780282 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.780897 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.781294 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.781428 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.781971 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.787722 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.782245 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.782106 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.783151 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.795095 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wqp6t"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.795487 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.795807 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.796098 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wqp6t"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.796215 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pz66"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.796628 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.796762 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.796924 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.797565 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.797718 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.801551 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.802033 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.802846 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.803249 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m2vtx"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.803642 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.803790 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.804336 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.806919 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.807412 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.810012 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4cv5l"]
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.810800 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.813330 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816615 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c92825-dfcf-4030-8fa7-4326fc350f10-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816670 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-etcd-serving-ca\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816699 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0523d611-7b4c-4293-b657-e076ee51aed2-machine-approver-tls\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt"
Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816742 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName:
\"kubernetes.io/empty-dir/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816771 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-image-import-ca\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816798 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816829 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-serving-cert\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816849 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c92825-dfcf-4030-8fa7-4326fc350f10-config\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816872 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm4n\" (UniqueName: \"kubernetes.io/projected/736a1895-9f79-4788-9f63-5b9b3406540d-kube-api-access-7mm4n\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816906 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng888\" (UniqueName: \"kubernetes.io/projected/eaf52478-5cc3-48c5-9f24-fc1ad41a3601-kube-api-access-ng888\") pod \"downloads-7954f5f757-ndbzv\" (UID: \"eaf52478-5cc3-48c5-9f24-fc1ad41a3601\") " pod="openshift-console/downloads-7954f5f757-ndbzv" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816927 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-node-pullsecrets\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816952 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-oauth-serving-cert\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.816981 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dlsvg\" (UniqueName: \"kubernetes.io/projected/0523d611-7b4c-4293-b657-e076ee51aed2-kube-api-access-dlsvg\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.817010 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0523d611-7b4c-4293-b657-e076ee51aed2-auth-proxy-config\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.817038 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwb9r\" (UniqueName: \"kubernetes.io/projected/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-kube-api-access-xwb9r\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.817067 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-trusted-ca\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.818290 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-etcd-serving-ca\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.818553 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c9c92825-dfcf-4030-8fa7-4326fc350f10-images\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.818651 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-service-ca\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.818742 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-config\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.818837 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-serving-cert\") pod \"console-operator-58897d9998-vlpmq\" 
(UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.818958 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhl9l\" (UniqueName: \"kubernetes.io/projected/538daf76-3827-4747-bffb-1106c125238c-kube-api-access-nhl9l\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819051 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qzg\" (UniqueName: \"kubernetes.io/projected/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-kube-api-access-c6qzg\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819144 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-audit-dir\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819249 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819389 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-etcd-client\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819509 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-audit\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819608 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-trusted-ca-bundle\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819696 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-config\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819777 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-console-config\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819867 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbkgj\" (UniqueName: \"kubernetes.io/projected/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-kube-api-access-xbkgj\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.819962 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s2lh\" (UniqueName: \"kubernetes.io/projected/18ca10bb-9e99-4051-a8d6-197657d74d3f-kube-api-access-4s2lh\") pod \"cluster-samples-operator-665b6dd947-ldgnq\" (UID: \"18ca10bb-9e99-4051-a8d6-197657d74d3f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820041 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76310b37-2f80-4da3-8b7e-8dde4ce8117c-serving-cert\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820168 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0523d611-7b4c-4293-b657-e076ee51aed2-config\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820277 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538daf76-3827-4747-bffb-1106c125238c-serving-cert\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820369 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-oauth-config\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820462 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-serving-cert\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820552 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-config\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " 
pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820641 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw6v5\" (UniqueName: \"kubernetes.io/projected/76310b37-2f80-4da3-8b7e-8dde4ce8117c-kube-api-access-lw6v5\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820728 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18ca10bb-9e99-4051-a8d6-197657d74d3f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldgnq\" (UID: \"18ca10bb-9e99-4051-a8d6-197657d74d3f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820788 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-audit-dir\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820815 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clq4\" (UniqueName: \"kubernetes.io/projected/c9c92825-dfcf-4030-8fa7-4326fc350f10-kube-api-access-6clq4\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.820981 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-encryption-config\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.821620 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.822426 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-config\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.823059 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-service-ca-bundle\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.823119 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-serving-cert\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.823305 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-client-ca\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.823815 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c92825-dfcf-4030-8fa7-4326fc350f10-config\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.825676 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-image-import-ca\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.826050 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-client-ca\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.826646 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.826720 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c92825-dfcf-4030-8fa7-4326fc350f10-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.827681 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-service-ca-bundle\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.828849 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c9c92825-dfcf-4030-8fa7-4326fc350f10-images\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: 
\"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.828968 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-node-pullsecrets\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.829053 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0523d611-7b4c-4293-b657-e076ee51aed2-machine-approver-tls\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.829173 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538daf76-3827-4747-bffb-1106c125238c-config\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.829607 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.830085 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-etcd-client\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.830540 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-serving-cert\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.831413 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-config\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.831452 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0523d611-7b4c-4293-b657-e076ee51aed2-auth-proxy-config\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.831903 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-service-ca\") pod 
\"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.832315 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-console-config\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.832336 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-config\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.832649 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-oauth-serving-cert\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.832721 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-serving-cert\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.832810 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0523d611-7b4c-4293-b657-e076ee51aed2-config\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.832883 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.833256 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-trusted-ca-bundle\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.833573 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.833862 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.834001 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-audit\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.835380 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-serving-cert\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.836096 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76310b37-2f80-4da3-8b7e-8dde4ce8117c-serving-cert\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.837336 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-encryption-config\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.838054 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18ca10bb-9e99-4051-a8d6-197657d74d3f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldgnq\" (UID: \"18ca10bb-9e99-4051-a8d6-197657d74d3f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.839038 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-oauth-config\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.839112 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.840535 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wzdw"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.842150 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7l2fq"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.843344 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538daf76-3827-4747-bffb-1106c125238c-serving-cert\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.843469 4852 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r7qp9"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.847072 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.849907 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ndbzv"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.852384 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.853631 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.855044 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.856482 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-c82cd"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.857705 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.858878 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.863553 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-p7m7x"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.866168 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.867046 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.872677 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.875467 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c8lcn"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.879686 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-srk2d"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.883473 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2dss"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.884690 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.885646 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.888171 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vlpmq"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.889682 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.890705 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pz66"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.891892 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.893697 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.895374 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-99cx6"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.896844 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.897021 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.897653 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.898794 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cjzjt"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.899862 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.900442 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.900974 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.902139 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-99cx6"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.903177 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cjzjt"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.904496 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.905299 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.906014 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.907050 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m2vtx"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.908734 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.910287 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4cv5l"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.911385 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.912625 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.914664 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.915296 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.916299 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rd2bl"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.917308 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.918556 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2tlvx"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.920220 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2tlvx"] Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.920435 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.924590 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qzg\" (UniqueName: \"kubernetes.io/projected/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-kube-api-access-c6qzg\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.924828 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-config\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.925132 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-trusted-ca\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.925267 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-serving-cert\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.926539 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-config\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.927098 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.929691 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-trusted-ca\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.931922 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-serving-cert\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.946061 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 10 11:52:54 crc kubenswrapper[4852]: I1210 11:52:54.968036 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 10 11:52:54 crc kubenswrapper[4852]: 
I1210 11:52:54.985464 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.005401 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.025270 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.045617 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.065327 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.085432 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.105520 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.147920 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.165645 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.185142 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.212367 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.225773 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.245519 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.266109 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.285635 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.306254 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.325395 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.345832 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" 
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.365259 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.385353 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.406213 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.425324 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.445438 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.466179 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.486209 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.505983 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.525182 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.547143 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.565133 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.585949 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.605144 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.624804 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.645706 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.665764 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.686803 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.705944 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.725598 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.745435 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.765841 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.785336 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.804066 4852 request.go:700] Waited for 1.001709768s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.805529 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.825353 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.845679 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.865836 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.885691 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.906710 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.925450 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.946427 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.969266 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 10 11:52:55 crc kubenswrapper[4852]: I1210 11:52:55.985416 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.005386 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.025761 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.046412 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.066078 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.085493 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.105007 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.125463 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.146036 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.165733 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.169628 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.169691 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.169745 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.170101 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.185479 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.205833 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.225980 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.253650 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.265545 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.300250 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng888\" (UniqueName: \"kubernetes.io/projected/eaf52478-5cc3-48c5-9f24-fc1ad41a3601-kube-api-access-ng888\") pod \"downloads-7954f5f757-ndbzv\" (UID: \"eaf52478-5cc3-48c5-9f24-fc1ad41a3601\") " pod="openshift-console/downloads-7954f5f757-ndbzv"
Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.324009 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clq4\" (UniqueName: \"kubernetes.io/projected/c9c92825-dfcf-4030-8fa7-4326fc350f10-kube-api-access-6clq4\") pod \"machine-api-operator-5694c8668f-gmm6c\" (UID: \"c9c92825-dfcf-4030-8fa7-4326fc350f10\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"
Dec 10
11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.339634 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm4n\" (UniqueName: \"kubernetes.io/projected/736a1895-9f79-4788-9f63-5b9b3406540d-kube-api-access-7mm4n\") pod \"console-f9d7485db-c82cd\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.359885 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6v5\" (UniqueName: \"kubernetes.io/projected/76310b37-2f80-4da3-8b7e-8dde4ce8117c-kube-api-access-lw6v5\") pod \"route-controller-manager-6576b87f9c-c6bdw\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.379807 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbkgj\" (UniqueName: \"kubernetes.io/projected/1cc9306f-7986-4543-b2d2-4a24fbbda5ca-kube-api-access-xbkgj\") pod \"apiserver-76f77b778f-7l2fq\" (UID: \"1cc9306f-7986-4543-b2d2-4a24fbbda5ca\") " pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.399208 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhl9l\" (UniqueName: \"kubernetes.io/projected/538daf76-3827-4747-bffb-1106c125238c-kube-api-access-nhl9l\") pod \"authentication-operator-69f744f599-s5kzp\" (UID: \"538daf76-3827-4747-bffb-1106c125238c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.421151 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsvg\" (UniqueName: \"kubernetes.io/projected/0523d611-7b4c-4293-b657-e076ee51aed2-kube-api-access-dlsvg\") pod \"machine-approver-56656f9798-wxkxt\" (UID: \"0523d611-7b4c-4293-b657-e076ee51aed2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.437719 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.443600 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwb9r\" (UniqueName: \"kubernetes.io/projected/5bda3bbd-e919-404e-ae6f-fa2beef3f56a-kube-api-access-xwb9r\") pod \"openshift-config-operator-7777fb866f-97l7t\" (UID: \"5bda3bbd-e919-404e-ae6f-fa2beef3f56a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.446824 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.465319 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.484270 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.485306 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.506799 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.515656 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.525753 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.551243 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.563673 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ndbzv" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.564516 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s2lh\" (UniqueName: \"kubernetes.io/projected/18ca10bb-9e99-4051-a8d6-197657d74d3f-kube-api-access-4s2lh\") pod \"cluster-samples-operator-665b6dd947-ldgnq\" (UID: \"18ca10bb-9e99-4051-a8d6-197657d74d3f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.582886 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.585736 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.592254 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.605528 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.614072 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.625612 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.646222 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.665332 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.689832 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.707818 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.726185 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.745401 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.764845 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.784945 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.805157 4852 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.823713 4852 request.go:700] Waited for 1.902969832s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.825908 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.832054 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.845289 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.878968 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qzg\" (UniqueName: \"kubernetes.io/projected/550d4cf0-5f6e-4fa1-94d7-1d662d7fff78-kube-api-access-c6qzg\") pod \"console-operator-58897d9998-vlpmq\" (UID: \"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78\") " pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.905520 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.925647 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.945618 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.965936 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 10 11:52:56 crc kubenswrapper[4852]: I1210 11:52:56.985691 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.006050 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.460721 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vlpmq" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464092 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464164 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db6ae1b8-eb2a-4790-a39f-37206d33525c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464215 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-certificates\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464296 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-bound-sa-token\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464349 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db6ae1b8-eb2a-4790-a39f-37206d33525c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464397 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rkd\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-kube-api-access-56rkd\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464430 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-trusted-ca\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.464480 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-tls\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.465042 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:57.965017386 +0000 UTC m=+64.050542610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.557066 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" event={"ID":"0523d611-7b4c-4293-b657-e076ee51aed2","Type":"ContainerStarted","Data":"a348ab90b0af28c351598fb14330c761171233ee11084b9a12daa68872820c43"} Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565344 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565556 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addd872d-85f4-4a82-b643-ab0bc2c5d154-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565587 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzd7\" (UniqueName: \"kubernetes.io/projected/addd872d-85f4-4a82-b643-ab0bc2c5d154-kube-api-access-qrzd7\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565662 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565694 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rkd\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-kube-api-access-56rkd\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565718 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95cac14e-2b1e-4728-a2d8-8e4613a0f330-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565738 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565758 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-client\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565812 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-trusted-ca\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565834 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95cac14e-2b1e-4728-a2d8-8e4613a0f330-config\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565855 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6g9h\" (UniqueName: \"kubernetes.io/projected/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-kube-api-access-l6g9h\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565877 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-config\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565897 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-policies\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565932 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addd872d-85f4-4a82-b643-ab0bc2c5d154-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.565954 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1081fdb-dd9a-49ac-9749-001eec6fb12d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566001 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lz8w\" (UniqueName: \"kubernetes.io/projected/fedc668c-0d6e-42ef-bf40-93a595875617-kube-api-access-7lz8w\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566022 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-ca\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566079 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400678dc-74d6-4b93-aa8a-7468710877d6-serving-cert\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566106 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/addd872d-85f4-4a82-b643-ab0bc2c5d154-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566131 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566152 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7c8\" (UniqueName: \"kubernetes.io/projected/400678dc-74d6-4b93-aa8a-7468710877d6-kube-api-access-qc7c8\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566173 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566210 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-tls\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566252 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-config\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566275 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566306 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwxc4\" (UniqueName: \"kubernetes.io/projected/63f9da9c-779b-4d17-829b-0310c9d360b2-kube-api-access-wwxc4\") pod \"dns-operator-744455d44c-c8lcn\" (UID: \"63f9da9c-779b-4d17-829b-0310c9d360b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566353 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t7zh\" (UniqueName: \"kubernetes.io/projected/f0b90ca8-b975-49a2-9373-715ea46eeabf-kube-api-access-5t7zh\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566376 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566430 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95cac14e-2b1e-4728-a2d8-8e4613a0f330-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566453 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566493 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db6ae1b8-eb2a-4790-a39f-37206d33525c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566518 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xslx\" (UniqueName: \"kubernetes.io/projected/e1081fdb-dd9a-49ac-9749-001eec6fb12d-kube-api-access-8xslx\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566565 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566587 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-service-ca\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566608 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-dir\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566631 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566655 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566720 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedc668c-0d6e-42ef-bf40-93a595875617-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566743 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566766 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-certificates\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566788 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566902 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-bound-sa-token\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566937 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63f9da9c-779b-4d17-829b-0310c9d360b2-metrics-tls\") pod \"dns-operator-744455d44c-c8lcn\" (UID: \"63f9da9c-779b-4d17-829b-0310c9d360b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566961 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedc668c-0d6e-42ef-bf40-93a595875617-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.566982 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0b90ca8-b975-49a2-9373-715ea46eeabf-serving-cert\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.567670 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.567709 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.567824 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1081fdb-dd9a-49ac-9749-001eec6fb12d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.567853 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78h9t\" (UniqueName: \"kubernetes.io/projected/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-kube-api-access-78h9t\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.567938 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db6ae1b8-eb2a-4790-a39f-37206d33525c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.567998 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.568026 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.568090 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-client-ca\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.568112 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-config\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.568714 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.068684104 +0000 UTC m=+64.154209348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.569797 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db6ae1b8-eb2a-4790-a39f-37206d33525c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.572668 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-certificates\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.576575 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-trusted-ca\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.589038 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db6ae1b8-eb2a-4790-a39f-37206d33525c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.597818 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-bound-sa-token\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.603062 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-tls\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.631082 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rkd\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-kube-api-access-56rkd\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.670796 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedc668c-0d6e-42ef-bf40-93a595875617-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.670876 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.670935 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-default-certificate\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.670965 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cv46\" (UniqueName: \"kubernetes.io/projected/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-kube-api-access-4cv46\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.670987 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh7nc\" (UniqueName: \"kubernetes.io/projected/9916fff5-914f-439b-886d-844aa8739d83-kube-api-access-kh7nc\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671040 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-metrics-certs\") pod 
\"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671059 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-plugins-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671177 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-csi-data-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671220 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671275 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671303 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9916fff5-914f-439b-886d-844aa8739d83-serving-cert\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671341 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e434f3f3-87cf-420b-822e-b0691ed878fb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pz66\" (UID: \"e434f3f3-87cf-420b-822e-b0691ed878fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671363 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-audit-policies\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.671974 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-registration-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " 
pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672049 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwmsg\" (UniqueName: \"kubernetes.io/projected/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-kube-api-access-bwmsg\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672089 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63f9da9c-779b-4d17-829b-0310c9d360b2-metrics-tls\") pod \"dns-operator-744455d44c-c8lcn\" (UID: \"63f9da9c-779b-4d17-829b-0310c9d360b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672207 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b90ca8-b975-49a2-9373-715ea46eeabf-serving-cert\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672339 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx9sg\" (UniqueName: \"kubernetes.io/projected/219bfbde-1edd-4898-988c-93697c6223f9-kube-api-access-zx9sg\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672372 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvrmg\" (UniqueName: \"kubernetes.io/projected/8bc2ea7c-2f45-49ac-b683-c57d84d8e758-kube-api-access-lvrmg\") pod \"control-plane-machine-set-operator-78cbb6b69f-n8gzr\" (UID: \"8bc2ea7c-2f45-49ac-b683-c57d84d8e758\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672465 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedc668c-0d6e-42ef-bf40-93a595875617-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672511 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672639 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdw6f\" (UniqueName: \"kubernetes.io/projected/2141a0da-f43d-4eb3-90a0-338623412c49-kube-api-access-xdw6f\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672740 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672784 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1081fdb-dd9a-49ac-9749-001eec6fb12d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672815 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99s96\" (UniqueName: \"kubernetes.io/projected/44d784f7-186c-4b34-aaf4-97fbedbbc7af-kube-api-access-99s96\") pod \"migrator-59844c95c7-xrtjj\" (UID: \"44d784f7-186c-4b34-aaf4-97fbedbbc7af\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672882 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78h9t\" (UniqueName: \"kubernetes.io/projected/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-kube-api-access-78h9t\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672916 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3d28862-df31-4d6c-af29-5fa5b49104ae-secret-volume\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672948 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3dc333d1-28e6-443f-b1aa-a91b83aade24-tmpfs\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.672983 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-config\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673018 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-client-ca\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673074 4852 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addd872d-85f4-4a82-b643-ab0bc2c5d154-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673106 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f37d7b2-67f5-492d-8419-4f47cd848151-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hlxf6\" (UID: \"7f37d7b2-67f5-492d-8419-4f47cd848151\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673211 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/165c4011-dd67-4dce-8cd4-63de1f286dbe-audit-dir\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673287 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95cac14e-2b1e-4728-a2d8-8e4613a0f330-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673330 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-client\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673369 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6g9h\" (UniqueName: \"kubernetes.io/projected/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-kube-api-access-l6g9h\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673411 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-config\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673439 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-policies\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673472 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwh9\" (UniqueName: 
\"kubernetes.io/projected/00fc488a-3478-434a-93f4-bdd59b51ecbd-kube-api-access-7rwh9\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673499 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-encryption-config\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673543 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addd872d-85f4-4a82-b643-ab0bc2c5d154-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673592 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lz8w\" (UniqueName: \"kubernetes.io/projected/fedc668c-0d6e-42ef-bf40-93a595875617-kube-api-access-7lz8w\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673623 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rs7p\" (UniqueName: \"kubernetes.io/projected/a8e54706-270f-43ac-936d-3a00ff537e09-kube-api-access-4rs7p\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673654 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-socket-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673680 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/219bfbde-1edd-4898-988c-93697c6223f9-config-volume\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673711 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673739 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a3d28862-df31-4d6c-af29-5fa5b49104ae-config-volume\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673765 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/611835ea-114d-4fca-be9a-d798ccdacdc8-images\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673791 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-config\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673815 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-stats-auth\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673839 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673871 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t7zh\" (UniqueName: \"kubernetes.io/projected/f0b90ca8-b975-49a2-9373-715ea46eeabf-kube-api-access-5t7zh\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673897 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-service-ca-bundle\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673935 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673967 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673989 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d331406-40f3-46fa-b660-f4cf0813d332-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674017 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4551a71c-b322-4aff-9487-a558d411643f-profile-collector-cert\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674048 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xslx\" (UniqueName: \"kubernetes.io/projected/e1081fdb-dd9a-49ac-9749-001eec6fb12d-kube-api-access-8xslx\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674076 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npvnx\" (UniqueName: \"kubernetes.io/projected/e434f3f3-87cf-420b-822e-b0691ed878fb-kube-api-access-npvnx\") pod \"multus-admission-controller-857f4d67dd-7pz66\" (UID: \"e434f3f3-87cf-420b-822e-b0691ed878fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674104 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674190 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-etcd-client\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674220 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674282 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rfsc\" (UniqueName: 
\"kubernetes.io/projected/3dc333d1-28e6-443f-b1aa-a91b83aade24-kube-api-access-9rfsc\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nm85\" (UniqueName: \"kubernetes.io/projected/a3d28862-df31-4d6c-af29-5fa5b49104ae-kube-api-access-8nm85\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674362 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc2ea7c-2f45-49ac-b683-c57d84d8e758-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n8gzr\" (UID: \"8bc2ea7c-2f45-49ac-b683-c57d84d8e758\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674408 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-srv-cert\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674437 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-metrics-tls\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674463 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/611835ea-114d-4fca-be9a-d798ccdacdc8-proxy-tls\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674486 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4551a71c-b322-4aff-9487-a558d411643f-srv-cert\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674542 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9916fff5-914f-439b-886d-844aa8739d83-config\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674570 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdvvs\" 
(UniqueName: \"kubernetes.io/projected/4551a71c-b322-4aff-9487-a558d411643f-kube-api-access-hdvvs\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674598 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d331406-40f3-46fa-b660-f4cf0813d332-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674628 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedc668c-0d6e-42ef-bf40-93a595875617-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674679 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674706 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3dc333d1-28e6-443f-b1aa-a91b83aade24-webhook-cert\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674737 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brpvq\" (UniqueName: \"kubernetes.io/projected/7f37d7b2-67f5-492d-8419-4f47cd848151-kube-api-access-brpvq\") pod \"package-server-manager-789f6589d5-hlxf6\" (UID: \"7f37d7b2-67f5-492d-8419-4f47cd848151\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674781 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-trusted-ca\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674785 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674816 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674847 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674874 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674906 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674941 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00fc488a-3478-434a-93f4-bdd59b51ecbd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.674971 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675001 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzd7\" (UniqueName: \"kubernetes.io/projected/addd872d-85f4-4a82-b643-ab0bc2c5d154-kube-api-access-qrzd7\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675041 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675080 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675180 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675228 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95cac14e-2b1e-4728-a2d8-8e4613a0f330-config\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675280 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcnhk\" (UniqueName: \"kubernetes.io/projected/611835ea-114d-4fca-be9a-d798ccdacdc8-kube-api-access-fcnhk\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675323 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1081fdb-dd9a-49ac-9749-001eec6fb12d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675357 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5cf\" (UniqueName: \"kubernetes.io/projected/1919e18e-d914-4ee7-8bf4-6de02e6760c2-kube-api-access-qq5cf\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675386 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675463 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-config\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.675471 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3dc333d1-28e6-443f-b1aa-a91b83aade24-apiservice-cert\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.676322 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.676314 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-policies\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.673891 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681265 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1081fdb-dd9a-49ac-9749-001eec6fb12d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681355 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/622ed8c8-e46e-41d7-8b9b-cbc637d25dc4-cert\") pod \"ingress-canary-cjzjt\" (UID: \"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4\") " pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.681426 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.181397503 +0000 UTC m=+64.266922937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681445 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1081fdb-dd9a-49ac-9749-001eec6fb12d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681562 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681605 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q52q\" (UniqueName: \"kubernetes.io/projected/622ed8c8-e46e-41d7-8b9b-cbc637d25dc4-kube-api-access-2q52q\") pod \"ingress-canary-cjzjt\" (UID: \"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4\") " pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681701 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-ca\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.681944 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00fc488a-3478-434a-93f4-bdd59b51ecbd-proxy-tls\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682314 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400678dc-74d6-4b93-aa8a-7468710877d6-serving-cert\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682445 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbs9\" (UniqueName: \"kubernetes.io/projected/165c4011-dd67-4dce-8cd4-63de1f286dbe-kube-api-access-wzbs9\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682485 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/addd872d-85f4-4a82-b643-ab0bc2c5d154-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682535 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682560 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-serving-cert\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682588 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7c8\" (UniqueName: \"kubernetes.io/projected/400678dc-74d6-4b93-aa8a-7468710877d6-kube-api-access-qc7c8\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682604 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95cac14e-2b1e-4728-a2d8-8e4613a0f330-config\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682654 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682701 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/611835ea-114d-4fca-be9a-d798ccdacdc8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682770 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8e54706-270f-43ac-936d-3a00ff537e09-node-bootstrap-token\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682799 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wwxc4\" (UniqueName: \"kubernetes.io/projected/63f9da9c-779b-4d17-829b-0310c9d360b2-kube-api-access-wwxc4\") pod \"dns-operator-744455d44c-c8lcn\" (UID: \"63f9da9c-779b-4d17-829b-0310c9d360b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682822 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682847 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq9vl\" (UniqueName: \"kubernetes.io/projected/3d331406-40f3-46fa-b660-f4cf0813d332-kube-api-access-sq9vl\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682934 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95cac14e-2b1e-4728-a2d8-8e4613a0f330-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682956 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.682983 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/219bfbde-1edd-4898-988c-93697c6223f9-metrics-tls\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683255 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-mountpoint-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683295 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9vq\" (UniqueName: \"kubernetes.io/projected/0a003606-92af-4a70-aa36-04637896c343-kube-api-access-rm9vq\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683586 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3d331406-40f3-46fa-b660-f4cf0813d332-ready\") pod \"cni-sysctl-allowlist-ds-rd2bl\" 
(UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683623 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683650 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a003606-92af-4a70-aa36-04637896c343-signing-key\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683675 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a003606-92af-4a70-aa36-04637896c343-signing-cabundle\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683700 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxh8m\" (UniqueName: \"kubernetes.io/projected/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-kube-api-access-sxh8m\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683749 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8e54706-270f-43ac-936d-3a00ff537e09-certs\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683894 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-service-ca\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.683955 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-dir\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.684288 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-dir\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.685040 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.689814 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.696036 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedc668c-0d6e-42ef-bf40-93a595875617-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.696753 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7l2fq"] Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.697404 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addd872d-85f4-4a82-b643-ab0bc2c5d154-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.691080 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-config\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.700005 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.700214 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-client\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.700398 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-config\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.700467 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-client-ca\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.700765 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63f9da9c-779b-4d17-829b-0310c9d360b2-metrics-tls\") pod \"dns-operator-744455d44c-c8lcn\" (UID: \"63f9da9c-779b-4d17-829b-0310c9d360b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.701047 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.701676 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95cac14e-2b1e-4728-a2d8-8e4613a0f330-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.702202 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400678dc-74d6-4b93-aa8a-7468710877d6-serving-cert\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.703011 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78h9t\" (UniqueName: \"kubernetes.io/projected/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-kube-api-access-78h9t\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.709124 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.709264 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-service-ca\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.709908 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.709953 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f0b90ca8-b975-49a2-9373-715ea46eeabf-etcd-ca\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.710364 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b90ca8-b975-49a2-9373-715ea46eeabf-serving-cert\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.710714 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.711993 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.713955 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.714524 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6g9h\" (UniqueName: \"kubernetes.io/projected/d02c986d-d2eb-4c7d-b864-ea5946ef7fba-kube-api-access-l6g9h\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgvrl\" (UID: \"d02c986d-d2eb-4c7d-b864-ea5946ef7fba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.715189 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lz8w\" (UniqueName: \"kubernetes.io/projected/fedc668c-0d6e-42ef-bf40-93a595875617-kube-api-access-7lz8w\") pod \"openshift-apiserver-operator-796bbdcf4f-ncphg\" (UID: \"fedc668c-0d6e-42ef-bf40-93a595875617\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.717561 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: 
I1210 11:52:57.722675 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r7qp9\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.735642 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/addd872d-85f4-4a82-b643-ab0bc2c5d154-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.786868 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.787146 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.287103414 +0000 UTC m=+64.372628638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787209 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3dc333d1-28e6-443f-b1aa-a91b83aade24-webhook-cert\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787254 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brpvq\" (UniqueName: \"kubernetes.io/projected/7f37d7b2-67f5-492d-8419-4f47cd848151-kube-api-access-brpvq\") pod \"package-server-manager-789f6589d5-hlxf6\" (UID: \"7f37d7b2-67f5-492d-8419-4f47cd848151\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787281 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-trusted-ca\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787299 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787314 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787335 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787354 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00fc488a-3478-434a-93f4-bdd59b51ecbd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787385 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787406 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcnhk\" (UniqueName: \"kubernetes.io/projected/611835ea-114d-4fca-be9a-d798ccdacdc8-kube-api-access-fcnhk\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787422 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5cf\" (UniqueName: \"kubernetes.io/projected/1919e18e-d914-4ee7-8bf4-6de02e6760c2-kube-api-access-qq5cf\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787436 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787452 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3dc333d1-28e6-443f-b1aa-a91b83aade24-apiservice-cert\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787487 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/622ed8c8-e46e-41d7-8b9b-cbc637d25dc4-cert\") pod \"ingress-canary-cjzjt\" (UID: \"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4\") " pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787508 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787525 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q52q\" (UniqueName: \"kubernetes.io/projected/622ed8c8-e46e-41d7-8b9b-cbc637d25dc4-kube-api-access-2q52q\") pod \"ingress-canary-cjzjt\" (UID: \"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4\") " pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787547 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00fc488a-3478-434a-93f4-bdd59b51ecbd-proxy-tls\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787563 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-serving-cert\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787580 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbs9\" (UniqueName: \"kubernetes.io/projected/165c4011-dd67-4dce-8cd4-63de1f286dbe-kube-api-access-wzbs9\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787604 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8e54706-270f-43ac-936d-3a00ff537e09-node-bootstrap-token\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787625 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/611835ea-114d-4fca-be9a-d798ccdacdc8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc 
kubenswrapper[4852]: I1210 11:52:57.787648 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq9vl\" (UniqueName: \"kubernetes.io/projected/3d331406-40f3-46fa-b660-f4cf0813d332-kube-api-access-sq9vl\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787666 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787691 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-mountpoint-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787713 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/219bfbde-1edd-4898-988c-93697c6223f9-metrics-tls\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787743 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9vq\" (UniqueName: \"kubernetes.io/projected/0a003606-92af-4a70-aa36-04637896c343-kube-api-access-rm9vq\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787765 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3d331406-40f3-46fa-b660-f4cf0813d332-ready\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787792 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a003606-92af-4a70-aa36-04637896c343-signing-key\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787807 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a003606-92af-4a70-aa36-04637896c343-signing-cabundle\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787823 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxh8m\" (UniqueName: \"kubernetes.io/projected/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-kube-api-access-sxh8m\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " 
pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787838 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8e54706-270f-43ac-936d-3a00ff537e09-certs\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787859 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787875 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cv46\" (UniqueName: \"kubernetes.io/projected/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-kube-api-access-4cv46\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787890 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh7nc\" (UniqueName: \"kubernetes.io/projected/9916fff5-914f-439b-886d-844aa8739d83-kube-api-access-kh7nc\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787907 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-default-certificate\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787926 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-metrics-certs\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787953 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-plugins-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.787985 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-csi-data-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788001 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9916fff5-914f-439b-886d-844aa8739d83-serving-cert\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788431 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e434f3f3-87cf-420b-822e-b0691ed878fb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pz66\" (UID: \"e434f3f3-87cf-420b-822e-b0691ed878fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788447 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-audit-policies\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788463 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-registration-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788479 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwmsg\" (UniqueName: \"kubernetes.io/projected/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-kube-api-access-bwmsg\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788497 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx9sg\" (UniqueName: \"kubernetes.io/projected/219bfbde-1edd-4898-988c-93697c6223f9-kube-api-access-zx9sg\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788512 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvrmg\" (UniqueName: \"kubernetes.io/projected/8bc2ea7c-2f45-49ac-b683-c57d84d8e758-kube-api-access-lvrmg\") pod \"control-plane-machine-set-operator-78cbb6b69f-n8gzr\" (UID: \"8bc2ea7c-2f45-49ac-b683-c57d84d8e758\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788530 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdw6f\" (UniqueName: \"kubernetes.io/projected/2141a0da-f43d-4eb3-90a0-338623412c49-kube-api-access-xdw6f\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788548 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99s96\" (UniqueName: \"kubernetes.io/projected/44d784f7-186c-4b34-aaf4-97fbedbbc7af-kube-api-access-99s96\") pod \"migrator-59844c95c7-xrtjj\" (UID: \"44d784f7-186c-4b34-aaf4-97fbedbbc7af\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788564 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3d28862-df31-4d6c-af29-5fa5b49104ae-secret-volume\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788580 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3dc333d1-28e6-443f-b1aa-a91b83aade24-tmpfs\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788600 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f37d7b2-67f5-492d-8419-4f47cd848151-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hlxf6\" (UID: \"7f37d7b2-67f5-492d-8419-4f47cd848151\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788627 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/165c4011-dd67-4dce-8cd4-63de1f286dbe-audit-dir\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788655 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwh9\" (UniqueName: \"kubernetes.io/projected/00fc488a-3478-434a-93f4-bdd59b51ecbd-kube-api-access-7rwh9\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788670 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-encryption-config\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788688 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rs7p\" (UniqueName: \"kubernetes.io/projected/a8e54706-270f-43ac-936d-3a00ff537e09-kube-api-access-4rs7p\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788703 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788718 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-socket-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788733 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/219bfbde-1edd-4898-988c-93697c6223f9-config-volume\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788749 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3d28862-df31-4d6c-af29-5fa5b49104ae-config-volume\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788763 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-stats-auth\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788778 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/611835ea-114d-4fca-be9a-d798ccdacdc8-images\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788801 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788821 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-service-ca-bundle\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788839 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d331406-40f3-46fa-b660-f4cf0813d332-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788854 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4551a71c-b322-4aff-9487-a558d411643f-profile-collector-cert\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788875 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npvnx\" (UniqueName: \"kubernetes.io/projected/e434f3f3-87cf-420b-822e-b0691ed878fb-kube-api-access-npvnx\") pod \"multus-admission-controller-857f4d67dd-7pz66\" (UID: \"e434f3f3-87cf-420b-822e-b0691ed878fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788890 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-etcd-client\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788909 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rfsc\" (UniqueName: \"kubernetes.io/projected/3dc333d1-28e6-443f-b1aa-a91b83aade24-kube-api-access-9rfsc\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788924 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nm85\" (UniqueName: \"kubernetes.io/projected/a3d28862-df31-4d6c-af29-5fa5b49104ae-kube-api-access-8nm85\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788941 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc2ea7c-2f45-49ac-b683-c57d84d8e758-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n8gzr\" (UID: \"8bc2ea7c-2f45-49ac-b683-c57d84d8e758\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788957 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-srv-cert\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788973 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-metrics-tls\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.788987 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/611835ea-114d-4fca-be9a-d798ccdacdc8-proxy-tls\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 
11:52:57.789000 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4551a71c-b322-4aff-9487-a558d411643f-srv-cert\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.789026 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d331406-40f3-46fa-b660-f4cf0813d332-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.789041 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9916fff5-914f-439b-886d-844aa8739d83-config\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.789056 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdvvs\" (UniqueName: \"kubernetes.io/projected/4551a71c-b322-4aff-9487-a558d411643f-kube-api-access-hdvvs\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.789861 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.790108 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-trusted-ca\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.790628 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.790942 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00fc488a-3478-434a-93f4-bdd59b51ecbd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.793011 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-mountpoint-dir\") pod 
\"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.793999 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95cac14e-2b1e-4728-a2d8-8e4613a0f330-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jzj5j\" (UID: \"95cac14e-2b1e-4728-a2d8-8e4613a0f330\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.794470 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.794628 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3dc333d1-28e6-443f-b1aa-a91b83aade24-webhook-cert\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.794994 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/611835ea-114d-4fca-be9a-d798ccdacdc8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.795161 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/611835ea-114d-4fca-be9a-d798ccdacdc8-images\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.795046 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3d331406-40f3-46fa-b660-f4cf0813d332-ready\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.797631 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3dc333d1-28e6-443f-b1aa-a91b83aade24-tmpfs\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.798554 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-serving-cert\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.799424 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.799789 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-csi-data-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.800183 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d331406-40f3-46fa-b660-f4cf0813d332-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.800827 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-service-ca-bundle\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.801680 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3d28862-df31-4d6c-af29-5fa5b49104ae-config-volume\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.802079 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.803092 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9916fff5-914f-439b-886d-844aa8739d83-serving-cert\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.803832 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3dc333d1-28e6-443f-b1aa-a91b83aade24-apiservice-cert\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.803899 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-plugins-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.804638 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/165c4011-dd67-4dce-8cd4-63de1f286dbe-audit-dir\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.805120 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9916fff5-914f-439b-886d-844aa8739d83-config\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.805485 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/165c4011-dd67-4dce-8cd4-63de1f286dbe-audit-policies\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.805942 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-registration-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.806063 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-srv-cert\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.806417 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2141a0da-f43d-4eb3-90a0-338623412c49-socket-dir\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.806511 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.306492199 +0000 UTC m=+64.392017423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.806889 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc2ea7c-2f45-49ac-b683-c57d84d8e758-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n8gzr\" (UID: \"8bc2ea7c-2f45-49ac-b683-c57d84d8e758\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.807405 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/219bfbde-1edd-4898-988c-93697c6223f9-config-volume\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.808908 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-default-certificate\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.809001 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a003606-92af-4a70-aa36-04637896c343-signing-cabundle\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.810401 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-metrics-certs\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.812415 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d331406-40f3-46fa-b660-f4cf0813d332-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.812565 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t7zh\" (UniqueName: \"kubernetes.io/projected/f0b90ca8-b975-49a2-9373-715ea46eeabf-kube-api-access-5t7zh\") pod \"etcd-operator-b45778765-srk2d\" (UID: \"f0b90ca8-b975-49a2-9373-715ea46eeabf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.812943 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/622ed8c8-e46e-41d7-8b9b-cbc637d25dc4-cert\") pod \"ingress-canary-cjzjt\" 
(UID: \"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4\") " pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.813857 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4551a71c-b322-4aff-9487-a558d411643f-srv-cert\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.814541 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addd872d-85f4-4a82-b643-ab0bc2c5d154-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.815066 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8e54706-270f-43ac-936d-3a00ff537e09-certs\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.819143 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-encryption-config\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.821946 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f37d7b2-67f5-492d-8419-4f47cd848151-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hlxf6\" (UID: \"7f37d7b2-67f5-492d-8419-4f47cd848151\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.821946 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3d28862-df31-4d6c-af29-5fa5b49104ae-secret-volume\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.822392 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/219bfbde-1edd-4898-988c-93697c6223f9-metrics-tls\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.823573 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.824166 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/0a003606-92af-4a70-aa36-04637896c343-signing-key\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.829100 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8e54706-270f-43ac-936d-3a00ff537e09-node-bootstrap-token\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.829179 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xslx\" (UniqueName: \"kubernetes.io/projected/e1081fdb-dd9a-49ac-9749-001eec6fb12d-kube-api-access-8xslx\") pod \"openshift-controller-manager-operator-756b6f6bc6-db77f\" (UID: \"e1081fdb-dd9a-49ac-9749-001eec6fb12d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.829583 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/165c4011-dd67-4dce-8cd4-63de1f286dbe-etcd-client\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.830822 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e434f3f3-87cf-420b-822e-b0691ed878fb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7pz66\" (UID: \"e434f3f3-87cf-420b-822e-b0691ed878fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.831167 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/611835ea-114d-4fca-be9a-d798ccdacdc8-proxy-tls\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.834344 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.834813 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-stats-auth\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.835298 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4551a71c-b322-4aff-9487-a558d411643f-profile-collector-cert\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.840334 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00fc488a-3478-434a-93f4-bdd59b51ecbd-proxy-tls\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.845319 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzd7\" (UniqueName: \"kubernetes.io/projected/addd872d-85f4-4a82-b643-ab0bc2c5d154-kube-api-access-qrzd7\") pod \"cluster-image-registry-operator-dc59b4c8b-9zrxr\" (UID: \"addd872d-85f4-4a82-b643-ab0bc2c5d154\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.845828 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-metrics-tls\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.846007 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4800c15-0f1c-4d15-89ec-f8d4c65fbc96-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m672w\" (UID: \"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.867394 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.876016 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7c8\" (UniqueName: \"kubernetes.io/projected/400678dc-74d6-4b93-aa8a-7468710877d6-kube-api-access-qc7c8\") pod \"controller-manager-879f6c89f-7wzdw\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.876327 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.883683 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwxc4\" (UniqueName: \"kubernetes.io/projected/63f9da9c-779b-4d17-829b-0310c9d360b2-kube-api-access-wwxc4\") pod \"dns-operator-744455d44c-c8lcn\" (UID: \"63f9da9c-779b-4d17-829b-0310c9d360b2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.886074 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.892879 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.893726 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.393693907 +0000 UTC m=+64.479219131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.902000 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.931349 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.939483 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brpvq\" (UniqueName: \"kubernetes.io/projected/7f37d7b2-67f5-492d-8419-4f47cd848151-kube-api-access-brpvq\") pod \"package-server-manager-789f6589d5-hlxf6\" (UID: \"7f37d7b2-67f5-492d-8419-4f47cd848151\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.952496 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.953042 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a40023e-b291-4a4c-8f6a-6cb91bb54c30-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9g9k7\" (UID: \"6a40023e-b291-4a4c-8f6a-6cb91bb54c30\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.958637 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.966448 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.977273 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdvvs\" (UniqueName: \"kubernetes.io/projected/4551a71c-b322-4aff-9487-a558d411643f-kube-api-access-hdvvs\") pod \"catalog-operator-68c6474976-szqb4\" (UID: \"4551a71c-b322-4aff-9487-a558d411643f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.979118 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.988890 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvrmg\" (UniqueName: \"kubernetes.io/projected/8bc2ea7c-2f45-49ac-b683-c57d84d8e758-kube-api-access-lvrmg\") pod \"control-plane-machine-set-operator-78cbb6b69f-n8gzr\" (UID: \"8bc2ea7c-2f45-49ac-b683-c57d84d8e758\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:57 crc kubenswrapper[4852]: I1210 11:52:57.995009 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:57 crc kubenswrapper[4852]: E1210 11:52:57.995564 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.495545908 +0000 UTC m=+64.581071132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.005850 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.008921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.024580 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.033327 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.037126 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q52q\" (UniqueName: \"kubernetes.io/projected/622ed8c8-e46e-41d7-8b9b-cbc637d25dc4-kube-api-access-2q52q\") pod \"ingress-canary-cjzjt\" (UID: \"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4\") " pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.046433 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gmm6c"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.055324 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbs9\" (UniqueName: \"kubernetes.io/projected/165c4011-dd67-4dce-8cd4-63de1f286dbe-kube-api-access-wzbs9\") pod \"apiserver-7bbb656c7d-7p8tn\" (UID: \"165c4011-dd67-4dce-8cd4-63de1f286dbe\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.065222 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ndbzv"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.073013 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdw6f\" (UniqueName: \"kubernetes.io/projected/2141a0da-f43d-4eb3-90a0-338623412c49-kube-api-access-xdw6f\") pod \"csi-hostpathplugin-2tlvx\" (UID: \"2141a0da-f43d-4eb3-90a0-338623412c49\") " pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.079440 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.083591 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.097156 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.102596 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.102946 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.602907601 +0000 UTC m=+64.688432835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.103388 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.107247 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.607197441 +0000 UTC m=+64.692722675 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.110946 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99s96\" (UniqueName: \"kubernetes.io/projected/44d784f7-186c-4b34-aaf4-97fbedbbc7af-kube-api-access-99s96\") pod \"migrator-59844c95c7-xrtjj\" (UID: \"44d784f7-186c-4b34-aaf4-97fbedbbc7af\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.123159 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rs7p\" (UniqueName: \"kubernetes.io/projected/a8e54706-270f-43ac-936d-3a00ff537e09-kube-api-access-4rs7p\") pod \"machine-config-server-p7m7x\" (UID: \"a8e54706-270f-43ac-936d-3a00ff537e09\") " pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.125147 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:52:58 crc kubenswrapper[4852]: W1210 11:52:58.138658 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf52478_5cc3_48c5_9f24_fc1ad41a3601.slice/crio-2d237e6f913bda011f77fa8f285888a0774950c442810d98e7034a840b9c6878 WatchSource:0}: Error finding container 2d237e6f913bda011f77fa8f285888a0774950c442810d98e7034a840b9c6878: Status 404 returned error can't find the container with id 2d237e6f913bda011f77fa8f285888a0774950c442810d98e7034a840b9c6878 Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.144034 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq9vl\" (UniqueName: \"kubernetes.io/projected/3d331406-40f3-46fa-b660-f4cf0813d332-kube-api-access-sq9vl\") pod \"cni-sysctl-allowlist-ds-rd2bl\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.155418 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5cf\" (UniqueName: \"kubernetes.io/projected/1919e18e-d914-4ee7-8bf4-6de02e6760c2-kube-api-access-qq5cf\") pod \"marketplace-operator-79b997595-4cv5l\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.168952 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p7m7x" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.170892 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9vq\" (UniqueName: \"kubernetes.io/projected/0a003606-92af-4a70-aa36-04637896c343-kube-api-access-rm9vq\") pod \"service-ca-9c57cc56f-m2vtx\" (UID: \"0a003606-92af-4a70-aa36-04637896c343\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.188504 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cjzjt" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.191332 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcnhk\" (UniqueName: \"kubernetes.io/projected/611835ea-114d-4fca-be9a-d798ccdacdc8-kube-api-access-fcnhk\") pod \"machine-config-operator-74547568cd-4zg6z\" (UID: \"611835ea-114d-4fca-be9a-d798ccdacdc8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.195752 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.209806 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.210323 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.710297124 +0000 UTC m=+64.795822348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.219272 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.220067 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwh9\" (UniqueName: \"kubernetes.io/projected/00fc488a-3478-434a-93f4-bdd59b51ecbd-kube-api-access-7rwh9\") pod \"machine-config-controller-84d6567774-wcmm5\" (UID: \"00fc488a-3478-434a-93f4-bdd59b51ecbd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.246916 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rfsc\" (UniqueName: \"kubernetes.io/projected/3dc333d1-28e6-443f-b1aa-a91b83aade24-kube-api-access-9rfsc\") pod \"packageserver-d55dfcdfc-vgn4s\" (UID: \"3dc333d1-28e6-443f-b1aa-a91b83aade24\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.261315 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxh8m\" (UniqueName: \"kubernetes.io/projected/03272e1c-aff4-409d-bf82-9e9b8d03ee4e-kube-api-access-sxh8m\") pod \"router-default-5444994796-wqp6t\" (UID: \"03272e1c-aff4-409d-bf82-9e9b8d03ee4e\") " pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.279937 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwmsg\" (UniqueName: \"kubernetes.io/projected/48bf1c25-e2cd-4e12-bfc0-2d99ad091df2-kube-api-access-bwmsg\") pod \"olm-operator-6b444d44fb-zc6dh\" (UID: \"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.300649 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.305983 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npvnx\" (UniqueName: \"kubernetes.io/projected/e434f3f3-87cf-420b-822e-b0691ed878fb-kube-api-access-npvnx\") pod \"multus-admission-controller-857f4d67dd-7pz66\" (UID: \"e434f3f3-87cf-420b-822e-b0691ed878fb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.308356 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.314181 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cv46\" (UniqueName: \"kubernetes.io/projected/5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6-kube-api-access-4cv46\") pod \"ingress-operator-5b745b69d9-rnn8r\" (UID: \"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.321525 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.324266 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vlpmq"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.327893 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.328285 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.828270538 +0000 UTC m=+64.913795762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.340841 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-c82cd"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.341138 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.350367 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nm85\" (UniqueName: \"kubernetes.io/projected/a3d28862-df31-4d6c-af29-5fa5b49104ae-kube-api-access-8nm85\") pod \"collect-profiles-29422785-vrx7c\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.352747 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx9sg\" (UniqueName: \"kubernetes.io/projected/219bfbde-1edd-4898-988c-93697c6223f9-kube-api-access-zx9sg\") pod \"dns-default-99cx6\" (UID: \"219bfbde-1edd-4898-988c-93697c6223f9\") " pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.353520 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wqp6t" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.362438 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.368087 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.372076 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.372748 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh7nc\" (UniqueName: \"kubernetes.io/projected/9916fff5-914f-439b-886d-844aa8739d83-kube-api-access-kh7nc\") pod \"service-ca-operator-777779d784-qwwc2\" (UID: \"9916fff5-914f-439b-886d-844aa8739d83\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.388946 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.412150 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.415404 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.418682 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s5kzp"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.423793 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f"] Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.432877 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.433407 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:58.933364483 +0000 UTC m=+65.018889707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.435061 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.453896 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.454801 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.473723 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-99cx6" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.535684 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.536072 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.036055536 +0000 UTC m=+65.121580760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.569726 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" event={"ID":"5bda3bbd-e919-404e-ae6f-fa2beef3f56a","Type":"ContainerStarted","Data":"4bd9651a8a6606e020c69e06512e0450e772953ea271f95c7d4233f1e75bebaa"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.571934 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" event={"ID":"e1081fdb-dd9a-49ac-9749-001eec6fb12d","Type":"ContainerStarted","Data":"83ddce6d35dac4d80014a2b526217784ba06ec3d4a2ed50eb20d65b69a55b34c"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.581003 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vlpmq" event={"ID":"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78","Type":"ContainerStarted","Data":"c996831d9e48e556f7947341f8607a86defc297e935b893eb51c39825369dc79"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.583036 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" event={"ID":"76310b37-2f80-4da3-8b7e-8dde4ce8117c","Type":"ContainerStarted","Data":"38894768e0a5539f283e85f0509bcb7337f97851b0f05577515f45e3c9a8367a"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.585275 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" event={"ID":"3d331406-40f3-46fa-b660-f4cf0813d332","Type":"ContainerStarted","Data":"366e04e47e794072b6b2620815d25ba976ea13a369a82d5c75d3606e7b80b2f7"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.586684 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ndbzv" event={"ID":"eaf52478-5cc3-48c5-9f24-fc1ad41a3601","Type":"ContainerStarted","Data":"2d237e6f913bda011f77fa8f285888a0774950c442810d98e7034a840b9c6878"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.586927 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.588488 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" event={"ID":"fedc668c-0d6e-42ef-bf40-93a595875617","Type":"ContainerStarted","Data":"70e4d172ca0d0b4b18c7086449fb528c9529a21c66dcbd90e2539c51fcbae125"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.589948 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" event={"ID":"c9c92825-dfcf-4030-8fa7-4326fc350f10","Type":"ContainerStarted","Data":"c141c4f756d429dce3b6d4cd937c4e82eef5be40d848337a8a718053f0aeea00"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.592020 4852 generic.go:334] "Generic (PLEG): container finished" podID="1cc9306f-7986-4543-b2d2-4a24fbbda5ca" containerID="b16e65287c037e52b30404e6d4196c0ee3552afc4bd07d8ef10afa882a418673" exitCode=0 Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.592105 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" event={"ID":"1cc9306f-7986-4543-b2d2-4a24fbbda5ca","Type":"ContainerDied","Data":"b16e65287c037e52b30404e6d4196c0ee3552afc4bd07d8ef10afa882a418673"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.592135 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" event={"ID":"1cc9306f-7986-4543-b2d2-4a24fbbda5ca","Type":"ContainerStarted","Data":"f2ceea829b66de53c88e84c86d896d06cbbd0dbc8088937f996a791be944f6ec"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.593587 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" event={"ID":"0523d611-7b4c-4293-b657-e076ee51aed2","Type":"ContainerStarted","Data":"6a742ab05131ff65905c72b4c2af2bc1116185bc8655e8a4f6d00a9e66b7ff2e"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.594661 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" event={"ID":"538daf76-3827-4747-bffb-1106c125238c","Type":"ContainerStarted","Data":"d8fd9440ecc0ec78c0364528df2dca0de9a59b16a1666340949bb4f8be22bc22"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.595730 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c82cd" event={"ID":"736a1895-9f79-4788-9f63-5b9b3406540d","Type":"ContainerStarted","Data":"532933a7b127e0613d8ccdd91fa264b484e95787261c27dc07c0bf2ae823eaa0"} Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.638933 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.639139 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.139073818 +0000 UTC m=+65.224599042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.639396 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.639790 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.139763895 +0000 UTC m=+65.225289119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.742907 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.744885 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.24485544 +0000 UTC m=+65.330380674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.746179 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.746886 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.246875522 +0000 UTC m=+65.332400746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.848811 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.848994 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.348956449 +0000 UTC m=+65.434481683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.849187 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.849540 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.349526744 +0000 UTC m=+65.435051968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.950739 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.951138 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.451116709 +0000 UTC m=+65.536641933 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:58 crc kubenswrapper[4852]: I1210 11:52:58.951467 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:58 crc kubenswrapper[4852]: E1210 11:52:58.951741 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.451731795 +0000 UTC m=+65.537257019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.053536 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.054494 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.55446961 +0000 UTC m=+65.639994824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.164921 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.170091 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.670073432 +0000 UTC m=+65.755598656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.239284 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-srk2d"] Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.258679 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j"] Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.266414 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c8lcn"] Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.266888 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.267687 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.767656615 +0000 UTC m=+65.853181839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.270726 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r7qp9"] Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.368498 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.369628 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.869550117 +0000 UTC m=+65.955075341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.469349 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.469622 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.969588773 +0000 UTC m=+66.055113997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.472171 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.472766 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:52:59.972736733 +0000 UTC m=+66.058262147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.574790 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.575316 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.075274653 +0000 UTC m=+66.160799877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.640101 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wqp6t" event={"ID":"03272e1c-aff4-409d-bf82-9e9b8d03ee4e","Type":"ContainerStarted","Data":"3f0b6a948a427d22a6ddfcb170a19f1de30df954385434cd8957d73c9ab47f18"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.640555 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wqp6t" event={"ID":"03272e1c-aff4-409d-bf82-9e9b8d03ee4e","Type":"ContainerStarted","Data":"6c5267dea31b56c06aa777a64dda1916fa8e475e840704beb702c3ad621ae209"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.650131 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" event={"ID":"63f9da9c-779b-4d17-829b-0310c9d360b2","Type":"ContainerStarted","Data":"66ee93e9b43766a5c7b1225c98fe0c046c53516a09df752d2acf63196898eaad"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.655618 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" event={"ID":"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d","Type":"ContainerStarted","Data":"187e7fad110a59cd05210e2f117d4590435afdd1e26c2cc88e16e34258dfcabe"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.664703 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wqp6t" podStartSLOduration=42.664678347 podStartE2EDuration="42.664678347s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:59.659486424 +0000 UTC m=+65.745011668" watchObservedRunningTime="2025-12-10 11:52:59.664678347 +0000 UTC m=+65.750203581" Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.678165 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.678585 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.178565211 +0000 UTC m=+66.264090435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.683278 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" event={"ID":"95cac14e-2b1e-4728-a2d8-8e4613a0f330","Type":"ContainerStarted","Data":"6261712bd510eaa3802f410773a42f1d2090b4a5dde504425d4f5d46744c9730"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.689529 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.693669 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p7m7x" event={"ID":"a8e54706-270f-43ac-936d-3a00ff537e09","Type":"ContainerStarted","Data":"84c6196e07bc3cea25d01a6eecdb4f8b2379b7d6af4731b7d16c5a9db6192a4f"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.695850 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" event={"ID":"f0b90ca8-b975-49a2-9373-715ea46eeabf","Type":"ContainerStarted","Data":"b95dbf5f6ce691e79ba146ff543d099a25e8967e252ee2365f2e6d323f13a4ad"} Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.714123 4852 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-c6bdw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.714178 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" podUID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.748999 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-p7m7x" podStartSLOduration=5.74898241 podStartE2EDuration="5.74898241s" podCreationTimestamp="2025-12-10 11:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:59.74738793 +0000 UTC m=+65.832913154" watchObservedRunningTime="2025-12-10 11:52:59.74898241 +0000 UTC m=+65.834507634" Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.782290 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.783992 4852 
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.789320 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.804407 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wzdw"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.808161 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" podStartSLOduration=41.808134941 podStartE2EDuration="41.808134941s" podCreationTimestamp="2025-12-10 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:52:59.786655113 +0000 UTC m=+65.872180347" watchObservedRunningTime="2025-12-10 11:52:59.808134941 +0000 UTC m=+65.893660165"
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.817490 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.817542 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.823632 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.871207 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.875358 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.878034 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7"]
Dec 10 11:52:59 crc kubenswrapper[4852]: I1210 11:52:59.911045 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:52:59 crc kubenswrapper[4852]: E1210 11:52:59.911800 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.411784699 +0000 UTC m=+66.497309923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.013682 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.014012 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.51397571 +0000 UTC m=+66.599500944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.014883 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.016265 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.516101664 +0000 UTC m=+66.601626898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: W1210 11:53:00.055085 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165c4011_dd67_4dce_8cd4_63de1f286dbe.slice/crio-93c743fa734095441236c9317618b71231188f2e8ba5661c01c756bdf4b6177f WatchSource:0}: Error finding container 93c743fa734095441236c9317618b71231188f2e8ba5661c01c756bdf4b6177f: Status 404 returned error can't find the container with id 93c743fa734095441236c9317618b71231188f2e8ba5661c01c756bdf4b6177f
Dec 10 11:53:00 crc kubenswrapper[4852]: W1210 11:53:00.062847 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4800c15_0f1c_4d15_89ec_f8d4c65fbc96.slice/crio-1a167a89373b35bac5fa5f1c785295724d5ff18f039a65e2de80ef5597284ff9 WatchSource:0}: Error finding container 1a167a89373b35bac5fa5f1c785295724d5ff18f039a65e2de80ef5597284ff9: Status 404 returned error can't find the container with id 1a167a89373b35bac5fa5f1c785295724d5ff18f039a65e2de80ef5597284ff9
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.113214 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cjzjt"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.115892 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.116319 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.616302284 +0000 UTC m=+66.701827508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.156138 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4cv5l"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.157339 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.185179 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-99cx6"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.186472 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.190820 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.192463 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.214321 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.218056 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.218447 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.718434153 +0000 UTC m=+66.803959367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.219615 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.253953 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.279617 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.279684 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m2vtx"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.280910 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7pz66"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.312424 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.316629 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2tlvx"]
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.318860 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.319059 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.819000882 +0000 UTC m=+66.904526116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.319177 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.319719 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.8196908 +0000 UTC m=+66.905216204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.361878 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wqp6t"
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.366522 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 10 11:53:00 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld
Dec 10 11:53:00 crc kubenswrapper[4852]: [+]process-running ok
Dec 10 11:53:00 crc kubenswrapper[4852]: healthz check failed
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.366591 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.420081 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.420472 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:00.920444503 +0000 UTC m=+67.005969727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.524282 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.524889 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.024863071 +0000 UTC m=+67.110388295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.628087 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.628476 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.128392276 +0000 UTC m=+67.213917500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.630534 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.631187 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.131169717 +0000 UTC m=+67.216694941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.720031 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cjzjt" event={"ID":"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4","Type":"ContainerStarted","Data":"7333b079e1e3f9a5ff48db81bcad6c0bb210e965909f2c467b0a2f62a0a62f32"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.721187 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" event={"ID":"7f37d7b2-67f5-492d-8419-4f47cd848151","Type":"ContainerStarted","Data":"44b3375a9f27e1747ebaeb6736bb6da5f00f869a030a26361bf97c8bb7879bb8"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.737059 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.737571 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.237551794 +0000 UTC m=+67.323077018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
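Each failed attempt above is parked for a fixed durationBeforeRetry of 500ms ("No retries permitted until …") before the reconciler may queue it again. A rough sketch of that retry cadence using apimachinery's wait helpers — the lookupDriver stub and its failure count are invented for illustration, and this is not the kubelet's actual nestedpendingoperations code:

```go
package main

import (
	"errors"
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// errNotRegistered mimics the error the log keeps repeating.
var errNotRegistered = errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")

func main() {
	attempts := 0
	// lookupDriver stands in for kubelet's registry lookup; here it is
	// hard-coded to fail four times before the driver "registers".
	lookupDriver := func() error {
		attempts++
		if attempts < 5 {
			return errNotRegistered
		}
		return nil
	}

	// Duration 500ms matches durationBeforeRetry in the log; Factor 1.0
	// keeps the interval fixed rather than growing exponentially.
	backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 1.0, Steps: 10}
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		if lookupErr := lookupDriver(); lookupErr != nil {
			fmt.Println("retrying:", lookupErr)
			return false, nil // not done yet; retry after the backoff
		}
		return true, nil
	})
	fmt.Println("final result:", err) // nil once the lookup succeeds
}
```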
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.747072 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" event={"ID":"1cc9306f-7986-4543-b2d2-4a24fbbda5ca","Type":"ContainerStarted","Data":"ba541a4c0e6d718232a3e263257f22d754bfdb220c6612c657aa9b924c2f6ef6"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.790970 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" event={"ID":"d02c986d-d2eb-4c7d-b864-ea5946ef7fba","Type":"ContainerStarted","Data":"46ee5bdf11a063290ff472a8a16238d41853178d299fea3c555e607be74e1301"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.792609 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" event={"ID":"3dc333d1-28e6-443f-b1aa-a91b83aade24","Type":"ContainerStarted","Data":"739db930a1e67502f3958a91bcb3cff6b6594d31713d1d5e4f094c77aaac58e5"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.820828 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" event={"ID":"611835ea-114d-4fca-be9a-d798ccdacdc8","Type":"ContainerStarted","Data":"6b0e9242b669e64ea7443d2ceee15ad5f47c12d2c194cbb8634304fb0f1d56b9"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.831394 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" event={"ID":"44d784f7-186c-4b34-aaf4-97fbedbbc7af","Type":"ContainerStarted","Data":"e3f120d8092196050ea943cdaaba935c39656805ef69456b6d828d1e535fa418"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.845302 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.845785 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.345768429 +0000 UTC m=+67.431293653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.856690 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" event={"ID":"a3d28862-df31-4d6c-af29-5fa5b49104ae","Type":"ContainerStarted","Data":"bf5269bcf90b1f2df75910c3eb17f1c2daadfbc6c39ce46fe3e1d6716537423a"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.875037 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" event={"ID":"00fc488a-3478-434a-93f4-bdd59b51ecbd","Type":"ContainerStarted","Data":"2a5c096ddd1bf6ccb537f76d694f1babee40125d6aa6b9c3ae744c4bc65c7abe"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.913051 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" event={"ID":"4551a71c-b322-4aff-9487-a558d411643f","Type":"ContainerStarted","Data":"cf38469d073e4ea0fbafb1478a0d22b166ce0d209cc60b3a4314e393c897ded0"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.931206 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c82cd" event={"ID":"736a1895-9f79-4788-9f63-5b9b3406540d","Type":"ContainerStarted","Data":"0ccd9f9c506f014c9b442e188cc33c8af27019ebc3e8d53ea579729aa43e2da9"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.946423 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.946695 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.446653866 +0000 UTC m=+67.532179090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.948081 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:00 crc kubenswrapper[4852]: E1210 11:53:00.954727 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.454704002 +0000 UTC m=+67.540229226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.965690 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" event={"ID":"6a40023e-b291-4a4c-8f6a-6cb91bb54c30","Type":"ContainerStarted","Data":"cbd8c8ff4013e6307a49dfbab1be26ae4d07c850de55e63d4e4419585d6cf919"}
Dec 10 11:53:00 crc kubenswrapper[4852]: I1210 11:53:00.965790 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-c82cd" podStartSLOduration=43.965763874 podStartE2EDuration="43.965763874s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:00.964103022 +0000 UTC m=+67.049628266" watchObservedRunningTime="2025-12-10 11:53:00.965763874 +0000 UTC m=+67.051289098"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.000577 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" event={"ID":"e434f3f3-87cf-420b-822e-b0691ed878fb","Type":"ContainerStarted","Data":"c6f8ca35889cf653564eac1e92ee065ab8341fe0593959932b05ab76ad0e7ae0"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.008323 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" event={"ID":"400678dc-74d6-4b93-aa8a-7468710877d6","Type":"ContainerStarted","Data":"6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.008394 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" event={"ID":"400678dc-74d6-4b93-aa8a-7468710877d6","Type":"ContainerStarted","Data":"ff27122228455340acda3dbb16731b99cb2fb365d88f45d455df27e22f9fc4fc"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.010122 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.029920 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" event={"ID":"76310b37-2f80-4da3-8b7e-8dde4ce8117c","Type":"ContainerStarted","Data":"67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.036451 4852 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7wzdw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.036619 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" podUID="400678dc-74d6-4b93-aa8a-7468710877d6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.047432 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.049630 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.051366 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.55134513 +0000 UTC m=+67.636870354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.056586 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" event={"ID":"9916fff5-914f-439b-886d-844aa8739d83","Type":"ContainerStarted","Data":"403e5fc4aca414237c5270135a62dc53e42918924c214227e056ac46b2a588e9"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.066818 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" podStartSLOduration=44.066775594 podStartE2EDuration="44.066775594s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.052599352 +0000 UTC m=+67.138124596" watchObservedRunningTime="2025-12-10 11:53:01.066775594 +0000 UTC m=+67.152300828"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.071385 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerStarted","Data":"c028b57be69aaf2caf0c7472f5e848ccec5b480717ad0d03260d95acb826af5c"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.088260 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" event={"ID":"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2","Type":"ContainerStarted","Data":"4da460d89c691ac145ee5798b997348e1133761e1abe3b7a4df883f6e419ed51"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.101488 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" event={"ID":"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96","Type":"ContainerStarted","Data":"1a167a89373b35bac5fa5f1c785295724d5ff18f039a65e2de80ef5597284ff9"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.116872 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" event={"ID":"e1081fdb-dd9a-49ac-9749-001eec6fb12d","Type":"ContainerStarted","Data":"780504249e062ff63cf129f13f17a80b6ca78566068a167eb0573e3317c42c3c"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.147678 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" event={"ID":"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6","Type":"ContainerStarted","Data":"d4c5071943668863df47b54b23fb4389c5354abab5620a51b27a9f0bf5969e82"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.151719 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.152136 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.652119145 +0000 UTC m=+67.737644369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.210780 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ndbzv" event={"ID":"eaf52478-5cc3-48c5-9f24-fc1ad41a3601","Type":"ContainerStarted","Data":"80ae4510e1e7b3a030fc569910f8f8817364c74b8b81e21b907b1a693d80eb92"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.239097 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" event={"ID":"165c4011-dd67-4dce-8cd4-63de1f286dbe","Type":"ContainerStarted","Data":"93c743fa734095441236c9317618b71231188f2e8ba5661c01c756bdf4b6177f"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.255579 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.255709 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-db77f" podStartSLOduration=44.25569013 podStartE2EDuration="44.25569013s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.150103843 +0000 UTC m=+67.235629067" watchObservedRunningTime="2025-12-10 11:53:01.25569013 +0000 UTC m=+67.341215354"
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.256746 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.756725957 +0000 UTC m=+67.842251181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.256991 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ndbzv" podStartSLOduration=44.256981933 podStartE2EDuration="44.256981933s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.256684726 +0000 UTC m=+67.342209950" watchObservedRunningTime="2025-12-10 11:53:01.256981933 +0000 UTC m=+67.342507157"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.263337 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" event={"ID":"fedc668c-0d6e-42ef-bf40-93a595875617","Type":"ContainerStarted","Data":"738aee9808953309d1cbb37438e10cb3fa2312a633683c8021320f6874afb7e5"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.268904 4852 generic.go:334] "Generic (PLEG): container finished" podID="5bda3bbd-e919-404e-ae6f-fa2beef3f56a" containerID="076df9654aa7ee2ff81e2b1c4517cf3c551c2080fd5cb59bb583d2c5471e3cc8" exitCode=0
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.269032 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" event={"ID":"5bda3bbd-e919-404e-ae6f-fa2beef3f56a","Type":"ContainerDied","Data":"076df9654aa7ee2ff81e2b1c4517cf3c551c2080fd5cb59bb583d2c5471e3cc8"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.298659 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" event={"ID":"8bc2ea7c-2f45-49ac-b683-c57d84d8e758","Type":"ContainerStarted","Data":"bd834d982a67838b48caf7a7e45836595b7a9cb7c004c6cb706f93a82e72debb"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.302806 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" event={"ID":"2141a0da-f43d-4eb3-90a0-338623412c49","Type":"ContainerStarted","Data":"5fbe8726dd92b1e6b3a467758998eaa701233d5d13f67472a522876917fe1eff"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.314928 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" event={"ID":"95cac14e-2b1e-4728-a2d8-8e4613a0f330","Type":"ContainerStarted","Data":"28df7e1cec806d46b8e074654d3d09790a26741af9a5ce5d88bace6e018317d2"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.334640 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncphg" podStartSLOduration=44.334062852 podStartE2EDuration="44.334062852s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.299610832 +0000 UTC m=+67.385136056" watchObservedRunningTime="2025-12-10 11:53:01.334062852 +0000 UTC m=+67.419588106"
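The csi-hostpathplugin-2tlvx containers starting above are what eventually clears these errors: the pod's node-driver-registrar sidecar serves the kubelet plugin-registration gRPC API on a socket under /var/lib/kubelet/plugins_registry/, and once the kubelet's GetInfo call succeeds it adds kubevirt.io.hostpath-provisioner to its registered-driver list. A bare-bones sketch of that registration service, assuming the k8s.io/kubelet pluginregistration/v1 API; both socket paths are illustrative, not taken from the log:

```go
package main

import (
	"context"
	"net"

	"google.golang.org/grpc"
	registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
)

// registrar implements the service the kubelet probes when it sees a new
// socket in its plugins_registry directory. Until GetInfo answers for a
// driver, that driver stays out of the kubelet's CSI list and mounts fail
// exactly as in the log above.
type registrar struct{}

func (registrar) GetInfo(ctx context.Context, req *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
	return &registerapi.PluginInfo{
		Type:              registerapi.CSIPlugin,
		Name:              "kubevirt.io.hostpath-provisioner",
		Endpoint:          "/var/lib/kubelet/plugins/csi-hostpath/csi.sock", // illustrative CSI socket path
		SupportedVersions: []string{"1.0.0"},
	}, nil
}

func (registrar) NotifyRegistrationStatus(ctx context.Context, status *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
	// The kubelet calls back here to report whether it accepted the driver.
	return &registerapi.RegistrationStatusResponse{}, nil
}

func main() {
	// Illustrative registration-socket path under the kubelet's watch dir.
	l, err := net.Listen("unix", "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock")
	if err != nil {
		panic(err)
	}
	srv := grpc.NewServer()
	registerapi.RegisterRegistrationServer(srv, registrar{})
	_ = srv.Serve(l)
}
```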
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.349030 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p7m7x" event={"ID":"a8e54706-270f-43ac-936d-3a00ff537e09","Type":"ContainerStarted","Data":"4617752bb881e82a8fb81da03ec1862d1ad0dd945d25d8509da41295f10149d5"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.352532 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" podStartSLOduration=44.352507264 podStartE2EDuration="44.352507264s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.330419719 +0000 UTC m=+67.415944943" watchObservedRunningTime="2025-12-10 11:53:01.352507264 +0000 UTC m=+67.438032488"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.357119 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.359367 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.859346358 +0000 UTC m=+67.944871582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.363826 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 10 11:53:01 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld
Dec 10 11:53:01 crc kubenswrapper[4852]: [+]process-running ok
Dec 10 11:53:01 crc kubenswrapper[4852]: healthz check failed
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.363958 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.372648 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" event={"ID":"18ca10bb-9e99-4051-a8d6-197657d74d3f","Type":"ContainerStarted","Data":"3b33b7ddb283566a7c64f35a3235d737876b6d49431427f8b44f7529e98ce381"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.372718 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" event={"ID":"18ca10bb-9e99-4051-a8d6-197657d74d3f","Type":"ContainerStarted","Data":"b368b9d60fa5cf6e9b69df4677a10c0a4ca69aaed4b03758b715f7d87798d72f"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.377394 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" event={"ID":"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d","Type":"ContainerStarted","Data":"48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.381441 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.381565 4852 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r7qp9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body=
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.381600 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.391255 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" event={"ID":"3d331406-40f3-46fa-b660-f4cf0813d332","Type":"ContainerStarted","Data":"d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.391995 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jzj5j" podStartSLOduration=44.391963742 podStartE2EDuration="44.391963742s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.389875748 +0000 UTC m=+67.475400972" watchObservedRunningTime="2025-12-10 11:53:01.391963742 +0000 UTC m=+67.477488976"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.392190 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.427779 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" podStartSLOduration=44.427750096 podStartE2EDuration="44.427750096s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.426770001 +0000 UTC m=+67.512295235" watchObservedRunningTime="2025-12-10 11:53:01.427750096 +0000 UTC m=+67.513275320"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.446131 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" event={"ID":"f0b90ca8-b975-49a2-9373-715ea46eeabf","Type":"ContainerStarted","Data":"d7a788fd045f52da70225186f6dd249105811a5bab9737972ace53c81e71631a"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.448612 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" podStartSLOduration=44.448601258 podStartE2EDuration="44.448601258s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.445668884 +0000 UTC m=+67.531194108" watchObservedRunningTime="2025-12-10 11:53:01.448601258 +0000 UTC m=+67.534126482"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.456331 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" event={"ID":"addd872d-85f4-4a82-b643-ab0bc2c5d154","Type":"ContainerStarted","Data":"49c0d71aca475479f46e3b1a0b17e1c6cef1c8d31351ee9c019cb49b258c4dc0"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.458597 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.459084 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.958792719 +0000 UTC m=+68.044317953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.461140 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.463779 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.466167 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podStartSLOduration=7.466152557 podStartE2EDuration="7.466152557s" podCreationTimestamp="2025-12-10 11:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.464214667 +0000 UTC m=+67.549739891" watchObservedRunningTime="2025-12-10 11:53:01.466152557 +0000 UTC m=+67.551677781"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.466857 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" event={"ID":"0523d611-7b4c-4293-b657-e076ee51aed2","Type":"ContainerStarted","Data":"4e646f8a7a41c4864636d598dbc0c37055d5aa5f393c4bb1cf0693bdad285564"}
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.466876 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:01.966855865 +0000 UTC m=+68.052381089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.508684 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" event={"ID":"c9c92825-dfcf-4030-8fa7-4326fc350f10","Type":"ContainerStarted","Data":"135ffa626bf2950b9448b74e8ed86119544a1185a14c702f2c2804f50cdbe92a"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.543536 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" event={"ID":"0a003606-92af-4a70-aa36-04637896c343","Type":"ContainerStarted","Data":"abf411c97668b483b072ed2765ac6c9f102bb8b9faac14b67d24ac1d92248d2b"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.560405 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vlpmq" event={"ID":"550d4cf0-5f6e-4fa1-94d7-1d662d7fff78","Type":"ContainerStarted","Data":"cff9525c290368d2179ddce007da3e76520a977aad6563268d1e083c59de9fef"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.560669 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vlpmq"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.575560 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.576178 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" podStartSLOduration=44.576147957 podStartE2EDuration="44.576147957s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.572916554 +0000 UTC m=+67.658441778" watchObservedRunningTime="2025-12-10 11:53:01.576147957 +0000 UTC m=+67.661673191"
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.577565 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.077544772 +0000 UTC m=+68.163069996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.579925 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vlpmq"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.583848 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-99cx6" event={"ID":"219bfbde-1edd-4898-988c-93697c6223f9","Type":"ContainerStarted","Data":"93e6f344d52320d26ec6c06a44a44c7ed8cd44f2b8e0bfcea50c53573f2a7444"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.596262 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" event={"ID":"538daf76-3827-4747-bffb-1106c125238c","Type":"ContainerStarted","Data":"45855db84fced99ec81198a630abc5263d4925f615b63e6aa1ee30a7afe7dfd8"}
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.602796 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rd2bl"]
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.604846 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-srk2d" podStartSLOduration=44.60483443 podStartE2EDuration="44.60483443s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.602932341 +0000 UTC m=+67.688457585" watchObservedRunningTime="2025-12-10 11:53:01.60483443 +0000 UTC m=+67.690359654"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.671811 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-s5kzp" podStartSLOduration=44.67179371 podStartE2EDuration="44.67179371s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.642939643 +0000 UTC m=+67.728464867" watchObservedRunningTime="2025-12-10 11:53:01.67179371 +0000 UTC m=+67.757318934"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.673932 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wxkxt" podStartSLOduration=44.673923194 podStartE2EDuration="44.673923194s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.671904973 +0000 UTC m=+67.757430207" watchObservedRunningTime="2025-12-10 11:53:01.673923194 +0000 UTC m=+67.759448418"
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.678079 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.680710 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.180692857 +0000 UTC m=+68.266218291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.781784 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.782166 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.282148549 +0000 UTC m=+68.367673773 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.800431 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" podStartSLOduration=44.800413886 podStartE2EDuration="44.800413886s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.788711207 +0000 UTC m=+67.874236441" watchObservedRunningTime="2025-12-10 11:53:01.800413886 +0000 UTC m=+67.885939110" Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.801690 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vlpmq" podStartSLOduration=44.801682048 podStartE2EDuration="44.801682048s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:01.72854514 +0000 UTC m=+67.814070364" watchObservedRunningTime="2025-12-10 11:53:01.801682048 +0000 UTC m=+67.887207272" Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.882925 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.883312 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.383298115 +0000 UTC m=+68.468823339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.984129 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.984394 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.484344787 +0000 UTC m=+68.569870011 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:01 crc kubenswrapper[4852]: I1210 11:53:01.984698 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:01 crc kubenswrapper[4852]: E1210 11:53:01.985119 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.485103057 +0000 UTC m=+68.570628331 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.088383 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.089758 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.589732812 +0000 UTC m=+68.675258036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.191049 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.191564 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.691546794 +0000 UTC m=+68.777072028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.292438 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.292653 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.792626286 +0000 UTC m=+68.878151510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.293265 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.293623 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.793606372 +0000 UTC m=+68.879131596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.361145 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 10 11:53:02 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld Dec 10 11:53:02 crc kubenswrapper[4852]: [+]process-running ok Dec 10 11:53:02 crc kubenswrapper[4852]: healthz check failed Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.361249 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.394275 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.394492 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.894466139 +0000 UTC m=+68.979991363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.394635 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.394995 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.894980302 +0000 UTC m=+68.980505596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.496071 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.496496 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:02.996479686 +0000 UTC m=+69.082004910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.597535 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.597941 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.097919828 +0000 UTC m=+69.183445102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.601785 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" event={"ID":"a3d28862-df31-4d6c-af29-5fa5b49104ae","Type":"ContainerStarted","Data":"6aadf508c61ddfe3c85a5a956f8a4ae044e4c217523b4ad73cc35db9b6dcdbe4"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.603617 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" event={"ID":"9916fff5-914f-439b-886d-844aa8739d83","Type":"ContainerStarted","Data":"d6cecc4ef060be02263af6ba2b94ecef966ef7941741eb96c8af57554a672ea1"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.604738 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" event={"ID":"611835ea-114d-4fca-be9a-d798ccdacdc8","Type":"ContainerStarted","Data":"e42cac9d614f33f84bb921816804f562bc90c562cc90bd866ebcf3aee51481ef"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.605952 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" event={"ID":"7f37d7b2-67f5-492d-8419-4f47cd848151","Type":"ContainerStarted","Data":"67c7fc2db70f2e858511c8ec5cf7bcf374b03bd4af72e8702c02ea01f8e5e58b"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.606805 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" event={"ID":"6a40023e-b291-4a4c-8f6a-6cb91bb54c30","Type":"ContainerStarted","Data":"db8908cc8a8a31733c19db60e1a31d2a762e0ab6ec02680530bb3e33cb88e171"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.607741 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" event={"ID":"a4800c15-0f1c-4d15-89ec-f8d4c65fbc96","Type":"ContainerStarted","Data":"ec87e5dcf39c47090ac88201bec4ed7bfb4599c01971ebe4d7f377592b3eb79c"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.609122 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerStarted","Data":"e697d3d8c05addb7c0d0f2ab7820da2aa002d00d4aa5f7cccae8fee93f733f5a"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.609315 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.610719 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4cv5l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.610762 4852 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.610814 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" event={"ID":"d02c986d-d2eb-4c7d-b864-ea5946ef7fba","Type":"ContainerStarted","Data":"adfd9ab701b4b85399c369f63351c22095502155d3d0694ff7f7f9fca761292d"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.612143 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" event={"ID":"00fc488a-3478-434a-93f4-bdd59b51ecbd","Type":"ContainerStarted","Data":"98c4b8e71f5c164b7e7c4a26227bdbe11f3381cfe5cd7b92c08dce7bd8273949"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.613781 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gmm6c" event={"ID":"c9c92825-dfcf-4030-8fa7-4326fc350f10","Type":"ContainerStarted","Data":"14fe3605e67fcc43cbd55908315ceb88616273fc023a7e9d1dad02a280e1b7ee"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.614951 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldgnq" event={"ID":"18ca10bb-9e99-4051-a8d6-197657d74d3f","Type":"ContainerStarted","Data":"494d8a361188b51deaf89b8feac6f27f5d247393dc9ddf6ad3925e1902e60826"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.616142 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cjzjt" event={"ID":"622ed8c8-e46e-41d7-8b9b-cbc637d25dc4","Type":"ContainerStarted","Data":"a63cbd97db45289b28f5af54a7c1b0eb08dc63b18a71ca836f07d83cd90bf0af"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.617733 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" event={"ID":"4551a71c-b322-4aff-9487-a558d411643f","Type":"ContainerStarted","Data":"bc57c5419768233fbaae69d7137b5b249d8353ee8ba002c0c05ba5764d331a4b"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.620075 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" event={"ID":"1cc9306f-7986-4543-b2d2-4a24fbbda5ca","Type":"ContainerStarted","Data":"e3ccbd1e13ae6adbcc487f5791764756cd23b02ab1b2d486f9f85aa5152e245e"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.622173 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" event={"ID":"63f9da9c-779b-4d17-829b-0310c9d360b2","Type":"ContainerStarted","Data":"efa84178ab6b3fc8d0c3fd107976fe04fc022cf0deb94cfff17e73667e43604d"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.623713 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n8gzr" event={"ID":"8bc2ea7c-2f45-49ac-b683-c57d84d8e758","Type":"ContainerStarted","Data":"728e1116d50f9d27fcd3e960c97d3b620cb6dd994348c72574f6a9173c1f4b65"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.626258 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" event={"ID":"3dc333d1-28e6-443f-b1aa-a91b83aade24","Type":"ContainerStarted","Data":"3b5a741f262db9f6d365edcbf4c677569ced54aa63ab9a8c0036565709000493"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.627639 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" event={"ID":"addd872d-85f4-4a82-b643-ab0bc2c5d154","Type":"ContainerStarted","Data":"93c9e187c3437199e3c636af56b665cdbff27e142547d578004f00f0e3a0476a"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.629539 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" event={"ID":"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6","Type":"ContainerStarted","Data":"bf80c03716cdd62bab00c0da7895ac1751fd075ef84639a793a5e6047c076c03"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.633021 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podStartSLOduration=45.632999465 podStartE2EDuration="45.632999465s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:02.629801372 +0000 UTC m=+68.715326596" watchObservedRunningTime="2025-12-10 11:53:02.632999465 +0000 UTC m=+68.718524799" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.638911 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" event={"ID":"44d784f7-186c-4b34-aaf4-97fbedbbc7af","Type":"ContainerStarted","Data":"bbc4f475b5471f326ebdedf426ae9aeb8fba0c02d2a5abba84a15a0e724f2311"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.644486 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" event={"ID":"e434f3f3-87cf-420b-822e-b0691ed878fb","Type":"ContainerStarted","Data":"0237b20a5499071f7263b5a7f51d2cec24e106a8a608a623739158316480e14a"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.646457 4852 generic.go:334] "Generic (PLEG): container finished" podID="165c4011-dd67-4dce-8cd4-63de1f286dbe" containerID="cb4ad92f350e5e7e5938632edd4000e247bce3e09dc130842be88cbcc3b8b794" exitCode=0 Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.646546 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" event={"ID":"165c4011-dd67-4dce-8cd4-63de1f286dbe","Type":"ContainerDied","Data":"cb4ad92f350e5e7e5938632edd4000e247bce3e09dc130842be88cbcc3b8b794"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.647508 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgvrl" podStartSLOduration=45.647495454 podStartE2EDuration="45.647495454s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:02.647200696 +0000 UTC m=+68.732725940" watchObservedRunningTime="2025-12-10 11:53:02.647495454 +0000 UTC m=+68.733020688" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.652662 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-99cx6" 
event={"ID":"219bfbde-1edd-4898-988c-93697c6223f9","Type":"ContainerStarted","Data":"1511116de107d0ea361c004525114b277ccef596ff3cce10a5fe4e6defdf49f2"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.664252 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" event={"ID":"5bda3bbd-e919-404e-ae6f-fa2beef3f56a","Type":"ContainerStarted","Data":"4220273e60a2177f3941dc277abb02e1771d615e34d696a1c74acc365c0aa768"} Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.670578 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ndbzv" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.670680 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.670709 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.692426 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" podStartSLOduration=45.692406137 podStartE2EDuration="45.692406137s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:02.690931139 +0000 UTC m=+68.776456353" watchObservedRunningTime="2025-12-10 11:53:02.692406137 +0000 UTC m=+68.777931381" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.699571 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.702346 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.202314296 +0000 UTC m=+69.287839570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.808159 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.809810 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.309794326 +0000 UTC m=+69.395319550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.840800 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.879339 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cjzjt" podStartSLOduration=8.879314374 podStartE2EDuration="8.879314374s" podCreationTimestamp="2025-12-10 11:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:02.776896376 +0000 UTC m=+68.862421620" watchObservedRunningTime="2025-12-10 11:53:02.879314374 +0000 UTC m=+68.964839598" Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.912838 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.913045 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.413023555 +0000 UTC m=+69.498548779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:02 crc kubenswrapper[4852]: I1210 11:53:02.913177 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:02 crc kubenswrapper[4852]: E1210 11:53:02.913604 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.41359166 +0000 UTC m=+69.499116884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.011771 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.016816 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.017346 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.517329362 +0000 UTC m=+69.602854586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.121987 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.122436 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.62242031 +0000 UTC m=+69.707945534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.228609 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.228871 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.728836762 +0000 UTC m=+69.814361986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.229503 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.229951 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.72992862 +0000 UTC m=+69.815454034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.340116 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.340595 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.840575383 +0000 UTC m=+69.926100607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.370099 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 10 11:53:03 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld Dec 10 11:53:03 crc kubenswrapper[4852]: [+]process-running ok Dec 10 11:53:03 crc kubenswrapper[4852]: healthz check failed Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.370171 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.441394 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.450340 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:03.950306312 +0000 UTC m=+70.035831536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.542950 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.543591 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.04356828 +0000 UTC m=+70.129093504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.644706 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.644757 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.645184 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.145163117 +0000 UTC m=+70.230688421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.668045 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4917776-2f46-46af-bd13-db5745bfdbf0-metrics-certs\") pod \"network-metrics-daemon-bjxbn\" (UID: \"d4917776-2f46-46af-bd13-db5745bfdbf0\") " pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.687079 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" event={"ID":"7f37d7b2-67f5-492d-8419-4f47cd848151","Type":"ContainerStarted","Data":"11fe7721a0498a7f9636f28c276b9b769fbd505397ec62bd8750ad1059f58d5b"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.688596 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.698601 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" event={"ID":"63f9da9c-779b-4d17-829b-0310c9d360b2","Type":"ContainerStarted","Data":"f7e6328e2ab8af22070257c7be4ffbbd953d1e2745163dad9653a787ecb52723"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.700302 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" event={"ID":"2141a0da-f43d-4eb3-90a0-338623412c49","Type":"ContainerStarted","Data":"3b43ce3a876c1816bbb5744288671793c84bb00d56da132868e9d68172cc4d29"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.714130 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjxbn" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.716983 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" event={"ID":"5f2540e3-b03b-4fd2-b6d9-0ea08b54e4e6","Type":"ContainerStarted","Data":"97767a03fbbabe5d8ee5d8e0f8e252313d0d321ca27592fdff3aacc111c51f2b"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.740224 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" event={"ID":"00fc488a-3478-434a-93f4-bdd59b51ecbd","Type":"ContainerStarted","Data":"d02fdb2f0f39abe38e9320a47bc699453a673deacd5550e26e350eef30c7fe2b"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.746446 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.747904 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.247883332 +0000 UTC m=+70.333408566 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.765599 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" event={"ID":"48bf1c25-e2cd-4e12-bfc0-2d99ad091df2","Type":"ContainerStarted","Data":"1f32055f5635bea611cf0300a8aded330c22230e0955954cabb113133683c192"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.765984 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.769421 4852 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zc6dh container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.769479 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" podUID="48bf1c25-e2cd-4e12-bfc0-2d99ad091df2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.775174 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" event={"ID":"0a003606-92af-4a70-aa36-04637896c343","Type":"ContainerStarted","Data":"cfd8067da9d356c39c5b38e507ceb515d3316c703b22905f2de539331db09753"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.777384 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" event={"ID":"611835ea-114d-4fca-be9a-d798ccdacdc8","Type":"ContainerStarted","Data":"2a5bdd3df78daa11ebe4bcd67de813f8f2c120cbbf85b493a54fabf5d3617471"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.799214 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" event={"ID":"44d784f7-186c-4b34-aaf4-97fbedbbc7af","Type":"ContainerStarted","Data":"67547c011f7c70e418a20f6681d566c326a8c0fcf0a44b76716e7e332da7027f"} Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.799284 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.803445 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4cv5l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.803513 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" 
podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.804883 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.804960 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.807464 4852 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-szqb4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.807503 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" podUID="4551a71c-b322-4aff-9487-a558d411643f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.824184 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" gracePeriod=30 Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.824516 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.867950 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.882021 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" podStartSLOduration=46.882003308 podStartE2EDuration="46.882003308s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:03.798105475 +0000 UTC m=+69.883630699" watchObservedRunningTime="2025-12-10 11:53:03.882003308 +0000 UTC m=+69.967528552" Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.883780 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.383765935 +0000 UTC m=+70.469291149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.957885 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wcmm5" podStartSLOduration=46.957862182 podStartE2EDuration="46.957862182s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:03.883365524 +0000 UTC m=+69.968890768" watchObservedRunningTime="2025-12-10 11:53:03.957862182 +0000 UTC m=+70.043387406" Dec 10 11:53:03 crc kubenswrapper[4852]: I1210 11:53:03.969444 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:03 crc kubenswrapper[4852]: E1210 11:53:03.970059 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.47003806 +0000 UTC m=+70.555563294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.071177 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.071564 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.571551874 +0000 UTC m=+70.657077098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.108786 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zg6z" podStartSLOduration=47.108762527 podStartE2EDuration="47.108762527s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:03.959298069 +0000 UTC m=+70.044823293" watchObservedRunningTime="2025-12-10 11:53:04.108762527 +0000 UTC m=+70.194287751" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.174749 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.175092 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.67507542 +0000 UTC m=+70.760600644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.268065 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-m2vtx" podStartSLOduration=46.268049471 podStartE2EDuration="46.268049471s" podCreationTimestamp="2025-12-10 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.116612212 +0000 UTC m=+70.202137436" watchObservedRunningTime="2025-12-10 11:53:04.268049471 +0000 UTC m=+70.353574685" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.275896 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.276205 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.776192484 +0000 UTC m=+70.861717708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.358852 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 10 11:53:04 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld Dec 10 11:53:04 crc kubenswrapper[4852]: [+]process-running ok Dec 10 11:53:04 crc kubenswrapper[4852]: healthz check failed Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.358911 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.359914 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" podStartSLOduration=47.359889102 podStartE2EDuration="47.359889102s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.269010826 +0000 UTC m=+70.354536060" watchObservedRunningTime="2025-12-10 11:53:04.359889102 +0000 UTC m=+70.445414326" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.376615 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.376763 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.876737503 +0000 UTC m=+70.962262737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.376909 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.377446 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.877423721 +0000 UTC m=+70.962948965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.477877 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.478794 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:04.97877238 +0000 UTC m=+71.064297604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.492505 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" podStartSLOduration=47.492479029 podStartE2EDuration="47.492479029s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.368495917 +0000 UTC m=+70.454021161" watchObservedRunningTime="2025-12-10 11:53:04.492479029 +0000 UTC m=+70.578004253" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.492727 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rnn8r" podStartSLOduration=47.492720685 podStartE2EDuration="47.492720685s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.468655186 +0000 UTC m=+70.554180430" watchObservedRunningTime="2025-12-10 11:53:04.492720685 +0000 UTC m=+70.578245909" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.588094 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.588410 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.088396926 +0000 UTC m=+71.173922150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.638453 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c8lcn" podStartSLOduration=47.638434174 podStartE2EDuration="47.638434174s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.565438116 +0000 UTC m=+70.650963350" watchObservedRunningTime="2025-12-10 11:53:04.638434174 +0000 UTC m=+70.723959398" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.698725 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.699553 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.199532982 +0000 UTC m=+71.285058206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.742846 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m672w" podStartSLOduration=47.742790563 podStartE2EDuration="47.742790563s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.710479088 +0000 UTC m=+70.796004322" watchObservedRunningTime="2025-12-10 11:53:04.742790563 +0000 UTC m=+70.828315787" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.803862 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.807977 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.307955506 +0000 UTC m=+71.393480730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.843721 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" event={"ID":"165c4011-dd67-4dce-8cd4-63de1f286dbe","Type":"ContainerStarted","Data":"2240f3f8208146a72bb5d22265c1d5b88de426f5c9b6472b3fb6febd8f883a31"} Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.843731 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qwwc2" podStartSLOduration=46.843710091 podStartE2EDuration="46.843710091s" podCreationTimestamp="2025-12-10 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.807644048 +0000 UTC m=+70.893169272" watchObservedRunningTime="2025-12-10 11:53:04.843710091 +0000 UTC m=+70.929235315" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.856883 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-99cx6" event={"ID":"219bfbde-1edd-4898-988c-93697c6223f9","Type":"ContainerStarted","Data":"4a0993ee8a19d0bbc588930e68671b1db1565add32abe97b750de3deae76e0e4"} Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.858032 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-99cx6" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.873290 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" event={"ID":"e434f3f3-87cf-420b-822e-b0691ed878fb","Type":"ContainerStarted","Data":"024be52296955540bb50e19703f16182c7ee8bc2ee7034c14cceb81970f74542"} Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.904739 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zc6dh" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.904897 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-szqb4" Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.916102 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:04 crc kubenswrapper[4852]: E1210 11:53:04.918921 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.418898437 +0000 UTC m=+71.504423671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:04 crc kubenswrapper[4852]: I1210 11:53:04.954978 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9g9k7" podStartSLOduration=47.9549607 podStartE2EDuration="47.9549607s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.896294476 +0000 UTC m=+70.981819710" watchObservedRunningTime="2025-12-10 11:53:04.9549607 +0000 UTC m=+71.040485924" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.022984 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.023393 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.523374278 +0000 UTC m=+71.608899502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.059511 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s" podStartSLOduration=48.059493583 podStartE2EDuration="48.059493583s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:05.0586145 +0000 UTC m=+71.144139724" watchObservedRunningTime="2025-12-10 11:53:05.059493583 +0000 UTC m=+71.145018807" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.060260 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xrtjj" podStartSLOduration=48.060252572 podStartE2EDuration="48.060252572s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:04.957738552 +0000 UTC m=+71.043263786" watchObservedRunningTime="2025-12-10 11:53:05.060252572 +0000 UTC m=+71.145777796" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.110224 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" podStartSLOduration=48.110204108 podStartE2EDuration="48.110204108s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:05.109957482 +0000 UTC m=+71.195482706" watchObservedRunningTime="2025-12-10 11:53:05.110204108 +0000 UTC m=+71.195729332" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.124485 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.125812 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.625767175 +0000 UTC m=+71.711292529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.148986 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" podStartSLOduration=48.148953501 podStartE2EDuration="48.148953501s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:05.144954707 +0000 UTC m=+71.230479951" watchObservedRunningTime="2025-12-10 11:53:05.148953501 +0000 UTC m=+71.234478725" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.226138 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.226575 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.72656038 +0000 UTC m=+71.812085594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.244245 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7pz66" podStartSLOduration=48.244198551 podStartE2EDuration="48.244198551s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:05.203335543 +0000 UTC m=+71.288860777" watchObservedRunningTime="2025-12-10 11:53:05.244198551 +0000 UTC m=+71.329723785" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.328018 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.328331 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.82829432 +0000 UTC m=+71.913819544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.328618 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.329104 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.829096761 +0000 UTC m=+71.914621985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.349488 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bjxbn"] Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.360331 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 10 11:53:05 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld Dec 10 11:53:05 crc kubenswrapper[4852]: [+]process-running ok Dec 10 11:53:05 crc kubenswrapper[4852]: healthz check failed Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.360405 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 10 11:53:05 crc kubenswrapper[4852]: W1210 11:53:05.361730 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4917776_2f46_46af_bd13_db5745bfdbf0.slice/crio-0a69a47653b8bb03ceffd4d5976a663efd2711f2c91c3729a19972d9fdfca63b WatchSource:0}: Error finding container 0a69a47653b8bb03ceffd4d5976a663efd2711f2c91c3729a19972d9fdfca63b: Status 404 returned error can't find the container with id 0a69a47653b8bb03ceffd4d5976a663efd2711f2c91c3729a19972d9fdfca63b Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.383492 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-99cx6" podStartSLOduration=11.383463602 podStartE2EDuration="11.383463602s" podCreationTimestamp="2025-12-10 11:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:05.38106651 +0000 UTC m=+71.466591754" watchObservedRunningTime="2025-12-10 11:53:05.383463602 +0000 UTC m=+71.468988826" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.384383 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" podStartSLOduration=48.384376766 podStartE2EDuration="48.384376766s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:05.352599406 +0000 UTC m=+71.438124630" watchObservedRunningTime="2025-12-10 11:53:05.384376766 +0000 UTC m=+71.469901990" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.429059 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:05 crc 
kubenswrapper[4852]: E1210 11:53:05.429391 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:05.929291461 +0000 UTC m=+72.014816685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.531260 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.531661 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.031645467 +0000 UTC m=+72.117170691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.594468 4852 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-97l7t container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.594542 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" podUID="5bda3bbd-e919-404e-ae6f-fa2beef3f56a" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.594954 4852 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-97l7t container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.594981 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t" podUID="5bda3bbd-e919-404e-ae6f-fa2beef3f56a" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.635988 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.636337 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.136291082 +0000 UTC m=+72.221816316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.636602 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.637212 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.137198536 +0000 UTC m=+72.222723760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.737796 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.738289 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.238269088 +0000 UTC m=+72.323794322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.842073 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.842495 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.342480183 +0000 UTC m=+72.428005407 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.882775 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" event={"ID":"d4917776-2f46-46af-bd13-db5745bfdbf0","Type":"ContainerStarted","Data":"0a69a47653b8bb03ceffd4d5976a663efd2711f2c91c3729a19972d9fdfca63b"} Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.942816 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.943021 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.442993781 +0000 UTC m=+72.528518995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:05 crc kubenswrapper[4852]: I1210 11:53:05.943322 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:05 crc kubenswrapper[4852]: E1210 11:53:05.943647 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.443631688 +0000 UTC m=+72.529156912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.013867 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-662t9"] Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.014945 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.017485 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.044353 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.045298 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.545279375 +0000 UTC m=+72.630804609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.061985 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-662t9"] Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.146432 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-catalog-content\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.146520 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.146584 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-utilities\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.146632 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rq2c\" (UniqueName: \"kubernetes.io/projected/2f614760-033c-494e-81d4-11c997e0db34-kube-api-access-2rq2c\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.147011 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.646997364 +0000 UTC m=+72.732522588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.164427 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xmxbw"] Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.165437 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.176663 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247136 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247311 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-utilities\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247344 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-catalog-content\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247363 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfrf\" (UniqueName: \"kubernetes.io/projected/b65cc728-9de3-466f-902b-47f30708118c-kube-api-access-5xfrf\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247416 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-catalog-content\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247475 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-utilities\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.247516 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rq2c\" (UniqueName: \"kubernetes.io/projected/2f614760-033c-494e-81d4-11c997e0db34-kube-api-access-2rq2c\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.247837 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.74782337 +0000 UTC m=+72.833348594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.248269 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-catalog-content\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.248479 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-utilities\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.298646 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rq2c\" (UniqueName: \"kubernetes.io/projected/2f614760-033c-494e-81d4-11c997e0db34-kube-api-access-2rq2c\") pod \"certified-operators-662t9\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.341196 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.348843 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-utilities\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.348900 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfrf\" (UniqueName: \"kubernetes.io/projected/b65cc728-9de3-466f-902b-47f30708118c-kube-api-access-5xfrf\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.348959 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-catalog-content\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.348985 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.349310 4852 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.849296282 +0000 UTC m=+72.934821506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.349528 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-catalog-content\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.349841 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-utilities\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.360448 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 10 11:53:06 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld Dec 10 11:53:06 crc kubenswrapper[4852]: [+]process-running ok Dec 10 11:53:06 crc kubenswrapper[4852]: healthz check failed Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.360516 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.404448 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfrf\" (UniqueName: \"kubernetes.io/projected/b65cc728-9de3-466f-902b-47f30708118c-kube-api-access-5xfrf\") pod \"community-operators-xmxbw\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.416206 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dn8dv"] Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.417572 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.439114 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.439152 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.449814 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.450355 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:06.950337924 +0000 UTC m=+73.035863148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.465634 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.485540 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.538837 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h2rb2"] Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.539799 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.552896 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-utilities\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.553057 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tswc\" (UniqueName: \"kubernetes.io/projected/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-kube-api-access-5tswc\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.553081 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-catalog-content\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.553109 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.554322 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.054308002 +0000 UTC m=+73.139833326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.564528 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.564580 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.564824 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.564868 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.615094 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.615125 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.624691 4852 patch_prober.go:28] interesting pod/console-f9d7485db-c82cd container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.625098 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-c82cd" podUID="736a1895-9f79-4788-9f63-5b9b3406540d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661160 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661619 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tswc\" (UniqueName: 
\"kubernetes.io/projected/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-kube-api-access-5tswc\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661671 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-catalog-content\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661745 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-catalog-content\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661789 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-utilities\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661842 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-utilities\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.661916 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jk2g\" (UniqueName: \"kubernetes.io/projected/4aa25f22-5823-46d9-ae2b-a507642dc0df-kube-api-access-8jk2g\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.662091 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.162067429 +0000 UTC m=+73.247592653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.662982 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-utilities\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.713153 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tswc\" (UniqueName: \"kubernetes.io/projected/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-kube-api-access-5tswc\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.763149 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.763212 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-catalog-content\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.763274 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-utilities\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.763349 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jk2g\" (UniqueName: \"kubernetes.io/projected/4aa25f22-5823-46d9-ae2b-a507642dc0df-kube-api-access-8jk2g\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.763547 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.263531632 +0000 UTC m=+73.349056856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.764173 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-catalog-content\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.764456 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-utilities\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.802628 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jk2g\" (UniqueName: \"kubernetes.io/projected/4aa25f22-5823-46d9-ae2b-a507642dc0df-kube-api-access-8jk2g\") pod \"community-operators-h2rb2\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.864978 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.865436 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.365406245 +0000 UTC m=+73.450931469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.878272 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.900634 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7l2fq" Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.905964 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-662t9"] Dec 10 11:53:06 crc kubenswrapper[4852]: I1210 11:53:06.967287 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:06 crc kubenswrapper[4852]: E1210 11:53:06.967735 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.4677208 +0000 UTC m=+73.553246024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.068509 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.069070 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.569023749 +0000 UTC m=+73.654548973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.075708 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-catalog-content\") pod \"certified-operators-dn8dv\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.136362 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmxbw"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.169942 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.170357 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.670344238 +0000 UTC m=+73.755869462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.177445 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dn8dv"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.197529 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2rb2"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.271254 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.271429 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.77139293 +0000 UTC m=+73.856918164 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.271662 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.272117 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.772107468 +0000 UTC m=+73.857632852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.339576 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.367849 4852 patch_prober.go:28] interesting pod/router-default-5444994796-wqp6t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 10 11:53:07 crc kubenswrapper[4852]: [-]has-synced failed: reason withheld Dec 10 11:53:07 crc kubenswrapper[4852]: [+]process-running ok Dec 10 11:53:07 crc kubenswrapper[4852]: healthz check failed Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.368472 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wqp6t" podUID="03272e1c-aff4-409d-bf82-9e9b8d03ee4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.369705 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.370506 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.374000 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.374417 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.874398233 +0000 UTC m=+73.959923457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.384928 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.385816 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.389775 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.481547 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.481640 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.481716 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.482066 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:07.982050967 +0000 UTC m=+74.067576191 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.587813 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.588317 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.588381 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.588810 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.088793888 +0000 UTC m=+74.174319112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.588839 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.631268 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.663457 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmxbw"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.704649 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.705075 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.205052807 +0000 UTC m=+74.290578211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.720679 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.807993 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.808406 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-10 11:53:08.308388169 +0000 UTC m=+74.393913383 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.909193 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:07 crc kubenswrapper[4852]: E1210 11:53:07.909545 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.409530633 +0000 UTC m=+74.495055857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.910707 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerStarted","Data":"27cbed4edd4bc3f76279910fe8eaa7b6593ee99b2f5f377f907f1a84db5fa840"} Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.925900 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerStarted","Data":"c9bd79f657f60ab01cee544f42e8bf701c29650405a4715d2b02bcdd9d73406b"} Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.932475 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mqmsb"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.943595 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.948339 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.982854 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqmsb"] Dec 10 11:53:07 crc kubenswrapper[4852]: I1210 11:53:07.983269 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2rb2"] Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.012622 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.014355 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.514336603 +0000 UTC m=+74.599861827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.015749 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-catalog-content\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.015891 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-utilities\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.016991 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29h7b\" (UniqueName: \"kubernetes.io/projected/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-kube-api-access-29h7b\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.017241 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 
10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.018701 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.518688347 +0000 UTC m=+74.604213571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.066958 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.085629 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.085663 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.124250 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.124391 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29h7b\" (UniqueName: \"kubernetes.io/projected/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-kube-api-access-29h7b\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.124544 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-catalog-content\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.124567 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-utilities\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.124898 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-utilities\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.124967 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.624951735 +0000 UTC m=+74.710476959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.126219 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-catalog-content\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.141416 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dn8dv"] Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.187984 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29h7b\" (UniqueName: \"kubernetes.io/projected/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-kube-api-access-29h7b\") pod \"redhat-marketplace-mqmsb\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.228355 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.236339 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.236666 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.736654075 +0000 UTC m=+74.822179289 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.261450 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.273795 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqmsb"
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.311440 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.311542 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.337608 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.338553 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.838530499 +0000 UTC m=+74.924055723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.355928 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wqp6t"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.388497 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wqp6t"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.394695 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nczw8"]
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.404910 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.417797 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.421549 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nczw8"]
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.443509 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8vfs\" (UniqueName: \"kubernetes.io/projected/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-kube-api-access-j8vfs\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.443571 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.443661 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-utilities\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.443680 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-catalog-content\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.445032 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:08.945019483 +0000 UTC m=+75.030544707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.467278 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4cv5l container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.467322 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.467389 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4cv5l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.467413 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.544968 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.545291 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-utilities\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.545319 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-catalog-content\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.545396 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8vfs\" (UniqueName: \"kubernetes.io/projected/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-kube-api-access-j8vfs\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.545661 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.045581212 +0000 UTC m=+75.131106446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.546256 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-utilities\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.546776 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-catalog-content\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.605758 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8vfs\" (UniqueName: \"kubernetes.io/projected/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-kube-api-access-j8vfs\") pod \"redhat-marketplace-nczw8\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.631383 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-97l7t"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.644459 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nczw8"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.648895 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.649201 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.149186811 +0000 UTC m=+75.234712035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.655070 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vgn4s"
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.757188 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.757361 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.257341808 +0000 UTC m=+75.342867032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.757784 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.758106 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.258097348 +0000 UTC m=+75.343622572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.859953 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.860460 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.360431023 +0000 UTC m=+75.445956258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:08 crc kubenswrapper[4852]: I1210 11:53:08.977503 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:08 crc kubenswrapper[4852]: E1210 11:53:08.978530 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.47850217 +0000 UTC m=+75.564027394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.045584 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerStarted","Data":"b9cf74512fece24b038c383a1d3321d5b2850737f1db9588a4b668d11df29064"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.062887 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" event={"ID":"d4917776-2f46-46af-bd13-db5745bfdbf0","Type":"ContainerStarted","Data":"2858326cf3ca25e940499c5d278fe260ebb760f1961c16c92649ffb542392549"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.071208 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" event={"ID":"2141a0da-f43d-4eb3-90a0-338623412c49","Type":"ContainerStarted","Data":"c54ebffbb289b9491a00597a08137476d8b97d18652c5acb36eedd07e631a78a"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.072631 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81","Type":"ContainerStarted","Data":"0336674725cf286d5c73beff413e2d5b35f5e88d534590b6eba1b2487ecc806e"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.079545 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.087513 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.587480189 +0000 UTC m=+75.673005583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.118134 4852 generic.go:334] "Generic (PLEG): container finished" podID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerID="ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549" exitCode=0
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.118470 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerDied","Data":"ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.118581 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerStarted","Data":"b3c9816b0ab42bff7a65915e97aaefdcd129b09b17a5413f566186ad1f8a7f29"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.121710 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.175061 4852 generic.go:334] "Generic (PLEG): container finished" podID="2f614760-033c-494e-81d4-11c997e0db34" containerID="f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a" exitCode=0
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.175610 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerDied","Data":"f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.189939 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.190534 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.690514643 +0000 UTC m=+75.776039867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.199863 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cn5tj"]
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.222478 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.232983 4852 generic.go:334] "Generic (PLEG): container finished" podID="b65cc728-9de3-466f-902b-47f30708118c" containerID="43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a" exitCode=0
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.235940 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.241263 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerDied","Data":"43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a"}
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.247028 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wqp6t"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.250367 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn5tj"]
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.291303 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.291820 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-utilities\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.291851 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-catalog-content\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.291904 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5p8\" (UniqueName: \"kubernetes.io/projected/031e1f57-c87c-4d8f-a05a-380efb0979ec-kube-api-access-hr5p8\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.293357 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.793336771 +0000 UTC m=+75.878861995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.320622 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqmsb"]
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.392826 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-utilities\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.392871 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-catalog-content\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.392949 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5p8\" (UniqueName: \"kubernetes.io/projected/031e1f57-c87c-4d8f-a05a-380efb0979ec-kube-api-access-hr5p8\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.393052 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.394696 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-utilities\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.398556 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.898534432 +0000 UTC m=+75.984059656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.400562 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-catalog-content\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.436264 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5p8\" (UniqueName: \"kubernetes.io/projected/031e1f57-c87c-4d8f-a05a-380efb0979ec-kube-api-access-hr5p8\") pod \"redhat-operators-cn5tj\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: W1210 11:53:09.445214 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa4dd551_8252_43a4_b1b3_d4daf088ddd5.slice/crio-c59853b6a0f45aeca71e3ccfd259fb6b8e366b07e7be2bcc8c23c9a0538ed0ea WatchSource:0}: Error finding container c59853b6a0f45aeca71e3ccfd259fb6b8e366b07e7be2bcc8c23c9a0538ed0ea: Status 404 returned error can't find the container with id c59853b6a0f45aeca71e3ccfd259fb6b8e366b07e7be2bcc8c23c9a0538ed0ea
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.494494 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.495536 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:09.995512367 +0000 UTC m=+76.081037591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.549312 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6d5jr"]
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.552103 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.555313 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d5jr"]
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.597550 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.597620 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-utilities\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.597721 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-catalog-content\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.597937 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smxxt\" (UniqueName: \"kubernetes.io/projected/0161f217-65f3-4afe-8037-281871787a8b-kube-api-access-smxxt\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.597971 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.097951425 +0000 UTC m=+76.183476649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.663621 4852 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.685793 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.692880 4852 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-7p8tn container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]log ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]etcd ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]etcd-readiness ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [-]informer-sync failed: reason withheld
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/max-in-flight-filter ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/openshift.io-StartUserInformer ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Dec 10 11:53:09 crc kubenswrapper[4852]: [+]shutdown ok
Dec 10 11:53:09 crc kubenswrapper[4852]: readyz check failed
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.692946 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn" podUID="165c4011-dd67-4dce-8cd4-63de1f286dbe" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.698440 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.698646 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smxxt\" (UniqueName: \"kubernetes.io/projected/0161f217-65f3-4afe-8037-281871787a8b-kube-api-access-smxxt\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.698789 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.198750411 +0000 UTC m=+76.284275635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.698912 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-utilities\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.699340 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-catalog-content\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.699386 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-utilities\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.700006 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-catalog-content\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.700707 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.702593 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nczw8"]
Dec 10 11:53:09 crc kubenswrapper[4852]: W1210 11:53:09.731931 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ee2051_d987_4dea_abca_3fcfbdb63ac5.slice/crio-e78145b8910729a13503ca87ce7bdeb2c39020f0957de0b6e2869c473edd1eda WatchSource:0}: Error finding container e78145b8910729a13503ca87ce7bdeb2c39020f0957de0b6e2869c473edd1eda: Status 404 returned error can't find the container with id e78145b8910729a13503ca87ce7bdeb2c39020f0957de0b6e2869c473edd1eda
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.754433 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxxt\" (UniqueName: \"kubernetes.io/projected/0161f217-65f3-4afe-8037-281871787a8b-kube-api-access-smxxt\") pod \"redhat-operators-6d5jr\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.800789 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.801917 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.301899677 +0000 UTC m=+76.387425001 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.884681 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.901881 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:09 crc kubenswrapper[4852]: E1210 11:53:09.902349 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.402331302 +0000 UTC m=+76.487856526 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:09 crc kubenswrapper[4852]: I1210 11:53:09.994197 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn5tj"]
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.003434 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:10 crc kubenswrapper[4852]: E1210 11:53:10.003816 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.503802165 +0000 UTC m=+76.589327389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:10 crc kubenswrapper[4852]: W1210 11:53:10.029328 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod031e1f57_c87c_4d8f_a05a_380efb0979ec.slice/crio-776a900a85ae063cd3fd29f8eda40b793807679965bfb12afab3cce63fbb3874 WatchSource:0}: Error finding container 776a900a85ae063cd3fd29f8eda40b793807679965bfb12afab3cce63fbb3874: Status 404 returned error can't find the container with id 776a900a85ae063cd3fd29f8eda40b793807679965bfb12afab3cce63fbb3874
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.104176 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:10 crc kubenswrapper[4852]: E1210 11:53:10.104726 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.604704453 +0000 UTC m=+76.690229687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.208189 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:10 crc kubenswrapper[4852]: E1210 11:53:10.208596 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.708580789 +0000 UTC m=+76.794106013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.244967 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d5jr"]
Dec 10 11:53:10 crc kubenswrapper[4852]: W1210 11:53:10.263599 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0161f217_65f3_4afe_8037_281871787a8b.slice/crio-d16e2678bf9ab8b8631bf9f51e55d4a3b92ed74e515db69ff3f345af9795b98e WatchSource:0}: Error finding container d16e2678bf9ab8b8631bf9f51e55d4a3b92ed74e515db69ff3f345af9795b98e: Status 404 returned error can't find the container with id d16e2678bf9ab8b8631bf9f51e55d4a3b92ed74e515db69ff3f345af9795b98e
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.264648 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerStarted","Data":"776a900a85ae063cd3fd29f8eda40b793807679965bfb12afab3cce63fbb3874"}
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.270104 4852 generic.go:334] "Generic (PLEG): container finished" podID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerID="9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0" exitCode=0
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.271145 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerDied","Data":"9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0"}
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.272123 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerStarted","Data":"c59853b6a0f45aeca71e3ccfd259fb6b8e366b07e7be2bcc8c23c9a0538ed0ea"}
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.272983 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerStarted","Data":"e78145b8910729a13503ca87ce7bdeb2c39020f0957de0b6e2869c473edd1eda"}
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.309533 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:10 crc kubenswrapper[4852]: E1210 11:53:10.311214 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.811187751 +0000 UTC m=+76.896712975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.411645 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:10 crc kubenswrapper[4852]: E1210 11:53:10.412266 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-10 11:53:10.912219853 +0000 UTC m=+76.997745077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2dss" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.489067 4852 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-10T11:53:09.663650583Z","Handler":null,"Name":""}
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.493273 4852 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.493338 4852 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.513506 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.527516 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.615481 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.632847 4852 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.632888 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.694195 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2dss\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:10 crc kubenswrapper[4852]: I1210 11:53:10.821744 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.048118 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2dss"]
Dec 10 11:53:11 crc kubenswrapper[4852]: W1210 11:53:11.052653 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb6ae1b8_eb2a_4790_a39f_37206d33525c.slice/crio-9ff74b7545d4ed40605610e28b657d721a789f1d916f87a4fd550fda87f630d0 WatchSource:0}: Error finding container 9ff74b7545d4ed40605610e28b657d721a789f1d916f87a4fd550fda87f630d0: Status 404 returned error can't find the container with id 9ff74b7545d4ed40605610e28b657d721a789f1d916f87a4fd550fda87f630d0
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.280682 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" event={"ID":"2141a0da-f43d-4eb3-90a0-338623412c49","Type":"ContainerStarted","Data":"1484fa531b113589f156402d3b2157355b60e2d2f25b5e26e17c1c1408a7dd93"}
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.282329 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81","Type":"ContainerStarted","Data":"f283c71f0bc6e627dbc6c0e6cc5039949662a508d851c97e0a24f249a1e90838"}
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.284122 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bjxbn" event={"ID":"d4917776-2f46-46af-bd13-db5745bfdbf0","Type":"ContainerStarted","Data":"15e0b089fff5fcefd5ae4357b43991f12b65680216442d02811362f2a933067c"}
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.285163 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" event={"ID":"db6ae1b8-eb2a-4790-a39f-37206d33525c","Type":"ContainerStarted","Data":"9ff74b7545d4ed40605610e28b657d721a789f1d916f87a4fd550fda87f630d0"}
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.286172 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerStarted","Data":"d16e2678bf9ab8b8631bf9f51e55d4a3b92ed74e515db69ff3f345af9795b98e"}
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.374774 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.375527 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.378466 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.378577 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.386690 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.424666 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8ac265-f484-42da-98d7-65746d57fc3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.424765 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8ac265-f484-42da-98d7-65746d57fc3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.526089 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8ac265-f484-42da-98d7-65746d57fc3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.526221 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8ac265-f484-42da-98d7-65746d57fc3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.526596 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8ac265-f484-42da-98d7-65746d57fc3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.687561 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8ac265-f484-42da-98d7-65746d57fc3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:11 crc kubenswrapper[4852]: I1210 11:53:11.987426 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 10 11:53:12 crc kubenswrapper[4852]: I1210 11:53:12.177117 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 10 11:53:12 crc kubenswrapper[4852]: I1210 11:53:12.417609 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 10 11:53:13 crc kubenswrapper[4852]: I1210 11:53:13.093675 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7p8tn"
Dec 10 11:53:13 crc kubenswrapper[4852]: I1210 11:53:13.485150 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-99cx6"
Dec 10 11:53:13 crc kubenswrapper[4852]: I1210 11:53:13.519296 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c8ac265-f484-42da-98d7-65746d57fc3c","Type":"ContainerStarted","Data":"d83362e18462a79286f7b72ca3a92d99c7732ae10b99d25a396e44864888bef5"}
Dec 10 11:53:13 crc kubenswrapper[4852]: I1210 11:53:13.525421 4852 generic.go:334] "Generic (PLEG): container finished" podID="a3d28862-df31-4d6c-af29-5fa5b49104ae" containerID="6aadf508c61ddfe3c85a5a956f8a4ae044e4c217523b4ad73cc35db9b6dcdbe4" exitCode=0
Dec 10 11:53:13 crc kubenswrapper[4852]: I1210 11:53:13.525565 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" event={"ID":"a3d28862-df31-4d6c-af29-5fa5b49104ae","Type":"ContainerDied","Data":"6aadf508c61ddfe3c85a5a956f8a4ae044e4c217523b4ad73cc35db9b6dcdbe4"}
Dec 10 11:53:13 crc kubenswrapper[4852]: I1210 11:53:13.528420 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerStarted","Data":"728e4b041fe8b805f8c87dc83ffe25e435b00375a90badc0feaeb328fb71fe13"}
Dec 10 11:53:14 crc kubenswrapper[4852]: I1210 11:53:14.536105 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerStarted","Data":"3ae0d1927db387b3f221b0f878ada285ad5efc1b0b644e43187477463b7238e6"}
Dec 10 11:53:14 crc kubenswrapper[4852]: I1210 11:53:14.537729 4852 generic.go:334] "Generic (PLEG): container finished" podID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerID="728e4b041fe8b805f8c87dc83ffe25e435b00375a90badc0feaeb328fb71fe13" exitCode=0
Dec 10 11:53:14 crc kubenswrapper[4852]: I1210 11:53:14.537781 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerDied","Data":"728e4b041fe8b805f8c87dc83ffe25e435b00375a90badc0feaeb328fb71fe13"}
Dec 10 11:53:14 crc kubenswrapper[4852]: I1210 11:53:14.540778 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerStarted","Data":"4ec937935dc93cbfb6ee95c4d0242123a54660befb64c511bb947add052b7d08"}
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.076512 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.198886 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3d28862-df31-4d6c-af29-5fa5b49104ae-secret-volume\") pod \"a3d28862-df31-4d6c-af29-5fa5b49104ae\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") "
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.198983 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3d28862-df31-4d6c-af29-5fa5b49104ae-config-volume\") pod \"a3d28862-df31-4d6c-af29-5fa5b49104ae\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") "
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.199050 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nm85\" (UniqueName: \"kubernetes.io/projected/a3d28862-df31-4d6c-af29-5fa5b49104ae-kube-api-access-8nm85\") pod \"a3d28862-df31-4d6c-af29-5fa5b49104ae\" (UID: \"a3d28862-df31-4d6c-af29-5fa5b49104ae\") "
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.199944 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d28862-df31-4d6c-af29-5fa5b49104ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3d28862-df31-4d6c-af29-5fa5b49104ae" (UID: "a3d28862-df31-4d6c-af29-5fa5b49104ae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.204877 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d28862-df31-4d6c-af29-5fa5b49104ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3d28862-df31-4d6c-af29-5fa5b49104ae" (UID: "a3d28862-df31-4d6c-af29-5fa5b49104ae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.208739 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d28862-df31-4d6c-af29-5fa5b49104ae-kube-api-access-8nm85" (OuterVolumeSpecName: "kube-api-access-8nm85") pod "a3d28862-df31-4d6c-af29-5fa5b49104ae" (UID: "a3d28862-df31-4d6c-af29-5fa5b49104ae"). InnerVolumeSpecName "kube-api-access-8nm85".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.300390 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3d28862-df31-4d6c-af29-5fa5b49104ae-config-volume\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.300707 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nm85\" (UniqueName: \"kubernetes.io/projected/a3d28862-df31-4d6c-af29-5fa5b49104ae-kube-api-access-8nm85\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.300721 4852 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3d28862-df31-4d6c-af29-5fa5b49104ae-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.548928 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" event={"ID":"a3d28862-df31-4d6c-af29-5fa5b49104ae","Type":"ContainerDied","Data":"bf5269bcf90b1f2df75910c3eb17f1c2daadfbc6c39ce46fe3e1d6716537423a"} Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.548981 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf5269bcf90b1f2df75910c3eb17f1c2daadfbc6c39ce46fe3e1d6716537423a" Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.549019 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c" Dec 10 11:53:15 crc kubenswrapper[4852]: I1210 11:53:15.555331 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerStarted","Data":"4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418"} Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.564273 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.564322 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.564699 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.564755 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.567029 4852 generic.go:334] "Generic (PLEG): container finished" podID="031e1f57-c87c-4d8f-a05a-380efb0979ec" 
containerID="4ec937935dc93cbfb6ee95c4d0242123a54660befb64c511bb947add052b7d08" exitCode=0 Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.567074 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerDied","Data":"4ec937935dc93cbfb6ee95c4d0242123a54660befb64c511bb947add052b7d08"} Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.619189 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:53:16 crc kubenswrapper[4852]: I1210 11:53:16.624320 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 11:53:18 crc kubenswrapper[4852]: E1210 11:53:18.200261 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:18 crc kubenswrapper[4852]: E1210 11:53:18.208044 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:18 crc kubenswrapper[4852]: E1210 11:53:18.209674 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:18 crc kubenswrapper[4852]: E1210 11:53:18.209745 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" Dec 10 11:53:18 crc kubenswrapper[4852]: I1210 11:53:18.459429 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:53:18 crc kubenswrapper[4852]: I1210 11:53:18.592950 4852 generic.go:334] "Generic (PLEG): container finished" podID="0161f217-65f3-4afe-8037-281871787a8b" containerID="4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418" exitCode=0 Dec 10 11:53:18 crc kubenswrapper[4852]: I1210 11:53:18.593082 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerDied","Data":"4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418"} Dec 10 11:53:18 crc kubenswrapper[4852]: I1210 11:53:18.596125 4852 generic.go:334] "Generic (PLEG): container finished" podID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerID="3ae0d1927db387b3f221b0f878ada285ad5efc1b0b644e43187477463b7238e6" exitCode=0 Dec 10 11:53:18 crc kubenswrapper[4852]: I1210 11:53:18.596163 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" 
event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerDied","Data":"3ae0d1927db387b3f221b0f878ada285ad5efc1b0b644e43187477463b7238e6"} Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.195275 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.611649 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" event={"ID":"db6ae1b8-eb2a-4790-a39f-37206d33525c","Type":"ContainerStarted","Data":"bf36db3f6de83443fd52ea4f6dca05f6b14e9506bb30c879b3f6bd8bf3a64653"} Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.614007 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c8ac265-f484-42da-98d7-65746d57fc3c","Type":"ContainerStarted","Data":"0893f01923c03a568cdc4982853c9786593eeb6bd067bbf3ccb33e73ecc93eef"} Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.615964 4852 generic.go:334] "Generic (PLEG): container finished" podID="d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81" containerID="f283c71f0bc6e627dbc6c0e6cc5039949662a508d851c97e0a24f249a1e90838" exitCode=0 Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.616366 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81","Type":"ContainerDied","Data":"f283c71f0bc6e627dbc6c0e6cc5039949662a508d851c97e0a24f249a1e90838"} Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.655068 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bjxbn" podStartSLOduration=62.655047057 podStartE2EDuration="1m2.655047057s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:19.635711742 +0000 UTC m=+85.721236976" watchObservedRunningTime="2025-12-10 11:53:19.655047057 +0000 UTC m=+85.740572301" Dec 10 11:53:19 crc kubenswrapper[4852]: I1210 11:53:19.710003 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.709982724 podStartE2EDuration="709.982724ms" podCreationTimestamp="2025-12-10 11:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:19.705778624 +0000 UTC m=+85.791303848" watchObservedRunningTime="2025-12-10 11:53:19.709982724 +0000 UTC m=+85.795507958" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.342369 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.342840 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 
10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.349902 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.385592 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.447733 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.447852 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.476852 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.483749 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.490387 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.495600 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.501818 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.735997 4852 generic.go:334] "Generic (PLEG): container finished" podID="5c8ac265-f484-42da-98d7-65746d57fc3c" containerID="0893f01923c03a568cdc4982853c9786593eeb6bd067bbf3ccb33e73ecc93eef" exitCode=0 Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.736079 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c8ac265-f484-42da-98d7-65746d57fc3c","Type":"ContainerDied","Data":"0893f01923c03a568cdc4982853c9786593eeb6bd067bbf3ccb33e73ecc93eef"} Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.765802 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" podStartSLOduration=63.765781136 podStartE2EDuration="1m3.765781136s" podCreationTimestamp="2025-12-10 11:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:20.765376065 +0000 UTC m=+86.850901289" watchObservedRunningTime="2025-12-10 11:53:20.765781136 +0000 UTC m=+86.851306360" Dec 10 11:53:20 crc kubenswrapper[4852]: I1210 11:53:20.822146 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.191498 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.564432 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.564511 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.564564 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-ndbzv" Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.564663 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.564714 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.564981 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= 
Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.565025 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.565154 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"80ae4510e1e7b3a030fc569910f8f8817364c74b8b81e21b907b1a693d80eb92"} pod="openshift-console/downloads-7954f5f757-ndbzv" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 10 11:53:26 crc kubenswrapper[4852]: I1210 11:53:26.565282 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" containerID="cri-o://80ae4510e1e7b3a030fc569910f8f8817364c74b8b81e21b907b1a693d80eb92" gracePeriod=2 Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.488945 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.522620 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.5225858570000002 podStartE2EDuration="1.522585857s" podCreationTimestamp="2025-12-10 11:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:27.520244785 +0000 UTC m=+93.605770019" watchObservedRunningTime="2025-12-10 11:53:27.522585857 +0000 UTC m=+93.608111081" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.652259 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8ac265-f484-42da-98d7-65746d57fc3c-kube-api-access\") pod \"5c8ac265-f484-42da-98d7-65746d57fc3c\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.652364 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8ac265-f484-42da-98d7-65746d57fc3c-kubelet-dir\") pod \"5c8ac265-f484-42da-98d7-65746d57fc3c\" (UID: \"5c8ac265-f484-42da-98d7-65746d57fc3c\") " Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.653017 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c8ac265-f484-42da-98d7-65746d57fc3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c8ac265-f484-42da-98d7-65746d57fc3c" (UID: "5c8ac265-f484-42da-98d7-65746d57fc3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.658923 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8ac265-f484-42da-98d7-65746d57fc3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c8ac265-f484-42da-98d7-65746d57fc3c" (UID: "5c8ac265-f484-42da-98d7-65746d57fc3c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.754262 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8ac265-f484-42da-98d7-65746d57fc3c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.754308 4852 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8ac265-f484-42da-98d7-65746d57fc3c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.794353 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c8ac265-f484-42da-98d7-65746d57fc3c","Type":"ContainerDied","Data":"d83362e18462a79286f7b72ca3a92d99c7732ae10b99d25a396e44864888bef5"} Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.794399 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83362e18462a79286f7b72ca3a92d99c7732ae10b99d25a396e44864888bef5" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.794369 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.796785 4852 generic.go:334] "Generic (PLEG): container finished" podID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerID="80ae4510e1e7b3a030fc569910f8f8817364c74b8b81e21b907b1a693d80eb92" exitCode=0 Dec 10 11:53:27 crc kubenswrapper[4852]: I1210 11:53:27.796830 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ndbzv" event={"ID":"eaf52478-5cc3-48c5-9f24-fc1ad41a3601","Type":"ContainerDied","Data":"80ae4510e1e7b3a030fc569910f8f8817364c74b8b81e21b907b1a693d80eb92"} Dec 10 11:53:28 crc kubenswrapper[4852]: E1210 11:53:28.199910 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:28 crc kubenswrapper[4852]: E1210 11:53:28.201403 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:28 crc kubenswrapper[4852]: E1210 11:53:28.203405 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:28 crc kubenswrapper[4852]: E1210 11:53:28.203505 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.449862 4852 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.587855 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kubelet-dir\") pod \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.588396 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kube-api-access\") pod \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\" (UID: \"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81\") " Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.588025 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81" (UID: "d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.588788 4852 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.597422 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81" (UID: "d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.689662 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.812849 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81","Type":"ContainerDied","Data":"0336674725cf286d5c73beff413e2d5b35f5e88d534590b6eba1b2487ecc806e"} Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.812894 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0336674725cf286d5c73beff413e2d5b35f5e88d534590b6eba1b2487ecc806e" Dec 10 11:53:30 crc kubenswrapper[4852]: I1210 11:53:30.812946 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 10 11:53:36 crc kubenswrapper[4852]: I1210 11:53:36.564623 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:36 crc kubenswrapper[4852]: I1210 11:53:36.564926 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:37 crc kubenswrapper[4852]: I1210 11:53:37.855735 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" event={"ID":"2141a0da-f43d-4eb3-90a0-338623412c49","Type":"ContainerStarted","Data":"cc9209e6bd80870f98c0f0648a569a3557636e780cfd2260da07e769272d3769"} Dec 10 11:53:38 crc kubenswrapper[4852]: I1210 11:53:38.086290 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hlxf6" Dec 10 11:53:38 crc kubenswrapper[4852]: E1210 11:53:38.198590 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:38 crc kubenswrapper[4852]: E1210 11:53:38.199869 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:38 crc kubenswrapper[4852]: E1210 11:53:38.201290 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:38 crc kubenswrapper[4852]: E1210 11:53:38.201592 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" Dec 10 11:53:40 crc kubenswrapper[4852]: I1210 11:53:40.830172 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.130368 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" podStartSLOduration=52.130346132 podStartE2EDuration="52.130346132s" podCreationTimestamp="2025-12-10 11:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:53:43.051627734 +0000 
UTC m=+109.137152958" watchObservedRunningTime="2025-12-10 11:53:46.130346132 +0000 UTC m=+112.215871356" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.132813 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 10 11:53:46 crc kubenswrapper[4852]: E1210 11:53:46.133037 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d28862-df31-4d6c-af29-5fa5b49104ae" containerName="collect-profiles" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.133052 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d28862-df31-4d6c-af29-5fa5b49104ae" containerName="collect-profiles" Dec 10 11:53:46 crc kubenswrapper[4852]: E1210 11:53:46.133061 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8ac265-f484-42da-98d7-65746d57fc3c" containerName="pruner" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.133067 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8ac265-f484-42da-98d7-65746d57fc3c" containerName="pruner" Dec 10 11:53:46 crc kubenswrapper[4852]: E1210 11:53:46.133084 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81" containerName="pruner" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.133090 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81" containerName="pruner" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.133223 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d28862-df31-4d6c-af29-5fa5b49104ae" containerName="collect-profiles" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.133274 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8ac265-f484-42da-98d7-65746d57fc3c" containerName="pruner" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.133290 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ab7907-ce71-4fd2-a5ca-bb60f1ca7d81" containerName="pruner" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.134163 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.136996 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.137220 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.141271 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.233295 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.233431 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.334798 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.334857 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.334912 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.367829 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.503894 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.564386 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:46 crc kubenswrapper[4852]: I1210 11:53:46.564462 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:47 crc kubenswrapper[4852]: I1210 11:53:47.262770 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-2tlvx" podUID="2141a0da-f43d-4eb3-90a0-338623412c49" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.43:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 11:53:48 crc kubenswrapper[4852]: E1210 11:53:48.201920 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:48 crc kubenswrapper[4852]: E1210 11:53:48.206021 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:48 crc kubenswrapper[4852]: E1210 11:53:48.208452 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:48 crc kubenswrapper[4852]: E1210 11:53:48.208580 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.326633 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.327500 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.338655 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.405584 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kubelet-dir\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.405675 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kube-api-access\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.405733 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-var-lock\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.507197 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-var-lock\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.507617 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kubelet-dir\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.507384 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-var-lock\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.507772 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kube-api-access\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.508040 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kubelet-dir\") pod \"installer-9-crc\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.530282 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:51 crc kubenswrapper[4852]: I1210 11:53:51.656549 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 10 11:53:55 crc kubenswrapper[4852]: I1210 11:53:55.145874 4852 trace.go:236] Trace[1930790331]: "Calculate volume metrics of cni-sysctl-allowlist for pod openshift-multus/cni-sysctl-allowlist-ds-rd2bl" (10-Dec-2025 11:53:54.126) (total time: 1019ms): Dec 10 11:53:55 crc kubenswrapper[4852]: Trace[1930790331]: [1.019824434s] [1.019824434s] END Dec 10 11:53:55 crc kubenswrapper[4852]: I1210 11:53:55.146550 4852 trace.go:236] Trace[1581521943]: "Calculate volume metrics of trusted-ca for pod openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9zrxr" (10-Dec-2025 11:53:54.128) (total time: 1018ms): Dec 10 11:53:55 crc kubenswrapper[4852]: Trace[1581521943]: [1.018359185s] [1.018359185s] END Dec 10 11:53:56 crc kubenswrapper[4852]: I1210 11:53:56.565399 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:53:56 crc kubenswrapper[4852]: I1210 11:53:56.566380 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:53:58 crc kubenswrapper[4852]: E1210 11:53:58.200527 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:58 crc kubenswrapper[4852]: E1210 11:53:58.202386 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:58 crc kubenswrapper[4852]: E1210 11:53:58.204713 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:53:58 crc kubenswrapper[4852]: E1210 11:53:58.204816 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" Dec 10 11:54:03 crc kubenswrapper[4852]: I1210 11:54:03.146040 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rd2bl_3d331406-40f3-46fa-b660-f4cf0813d332/kube-multus-additional-cni-plugins/0.log" Dec 10 11:54:03 crc kubenswrapper[4852]: 
I1210 11:54:03.147111 4852 generic.go:334] "Generic (PLEG): container finished" podID="3d331406-40f3-46fa-b660-f4cf0813d332" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" exitCode=137 Dec 10 11:54:03 crc kubenswrapper[4852]: I1210 11:54:03.147147 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" event={"ID":"3d331406-40f3-46fa-b660-f4cf0813d332","Type":"ContainerDied","Data":"d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0"} Dec 10 11:54:06 crc kubenswrapper[4852]: I1210 11:54:06.565049 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:54:06 crc kubenswrapper[4852]: I1210 11:54:06.565445 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:54:08 crc kubenswrapper[4852]: E1210 11:54:08.197268 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0 is running failed: container process not found" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:54:08 crc kubenswrapper[4852]: E1210 11:54:08.197687 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0 is running failed: container process not found" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:54:08 crc kubenswrapper[4852]: E1210 11:54:08.198177 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0 is running failed: container process not found" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 10 11:54:08 crc kubenswrapper[4852]: E1210 11:54:08.198207 4852 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins" Dec 10 11:54:10 crc kubenswrapper[4852]: E1210 11:54:10.948622 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 10 11:54:10 crc kubenswrapper[4852]: E1210 11:54:10.949114 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5xfrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xmxbw_openshift-marketplace(b65cc728-9de3-466f-902b-47f30708118c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 10 11:54:10 crc kubenswrapper[4852]: E1210 11:54:10.950376 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xmxbw" podUID="b65cc728-9de3-466f-902b-47f30708118c" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.001153 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xmxbw" podUID="b65cc728-9de3-466f-902b-47f30708118c" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.088201 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rd2bl_3d331406-40f3-46fa-b660-f4cf0813d332/kube-multus-additional-cni-plugins/0.log" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.088642 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.094264 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.094455 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5tswc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dn8dv_openshift-marketplace(eb61d8f4-66d0-4d11-955f-4984ab5e18e6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.096677 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dn8dv" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.098630 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.098827 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2rq2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-662t9_openshift-marketplace(2f614760-033c-494e-81d4-11c997e0db34): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.100058 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-662t9" podUID="2f614760-033c-494e-81d4-11c997e0db34" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.118023 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.119692 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jk2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-h2rb2_openshift-marketplace(4aa25f22-5823-46d9-ae2b-a507642dc0df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 10 11:54:14 crc kubenswrapper[4852]: E1210 11:54:14.121302 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-h2rb2" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.157812 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d331406-40f3-46fa-b660-f4cf0813d332-tuning-conf-dir\") pod \"3d331406-40f3-46fa-b660-f4cf0813d332\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.158746 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d331406-40f3-46fa-b660-f4cf0813d332-cni-sysctl-allowlist\") pod \"3d331406-40f3-46fa-b660-f4cf0813d332\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.158791 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq9vl\" (UniqueName: \"kubernetes.io/projected/3d331406-40f3-46fa-b660-f4cf0813d332-kube-api-access-sq9vl\") pod \"3d331406-40f3-46fa-b660-f4cf0813d332\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.158034 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d331406-40f3-46fa-b660-f4cf0813d332-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "3d331406-40f3-46fa-b660-f4cf0813d332" (UID: "3d331406-40f3-46fa-b660-f4cf0813d332"). InnerVolumeSpecName "tuning-conf-dir". 
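PluginName "kubernetes.io/host-path", VolumeGidValue ""

The &Container{...} dumps in the "Unhandled Error" entries above are Go struct literals for the catalog extract-content init container that keeps failing to pull. A minimal sketch of the same spec as a k8s.io/api/core/v1 literal, assuming that module is available; the field values are copied from the dump, while the ptrTo helper and the trimmed-down field set are mine, not kubelet code:

```go
// Sketch only: the extract-content init container from the &Container{...}
// dump above, as a k8s.io/api/core/v1 struct literal. Field values are
// copied from the log; ptrTo is a local helper, not a kubelet function.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func ptrTo[T any](v T) *T { return &v }

func main() {
	c := corev1.Container{
		Name:    "extract-content",
		Image:   "registry.redhat.io/redhat/community-operator-index:v4.18",
		Command: []string{"/utilities/copy-content"},
		Args: []string{
			"--catalog.from=/configs",
			"--catalog.to=/extracted-catalog/catalog",
			"--cache.from=/tmp/cache",
			"--cache.to=/extracted-catalog/cache",
		},
		VolumeMounts: []corev1.VolumeMount{
			{Name: "utilities", MountPath: "/utilities"},
			{Name: "catalog-content", MountPath: "/extracted-catalog"},
		},
		TerminationMessagePath:   "/dev/termination-log",
		TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
		// ImagePullPolicy Always is why every sync attempt hits the registry.
		ImagePullPolicy: corev1.PullAlways,
		SecurityContext: &corev1.SecurityContext{
			Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
			RunAsUser:                ptrTo(int64(1000170000)),
			RunAsNonRoot:             ptrTo(true),
			AllowPrivilegeEscalation: ptrTo(false),
		},
	}
	fmt.Println(c.Name, "pulls", c.Image)
}
```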
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.161703 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d331406-40f3-46fa-b660-f4cf0813d332-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "3d331406-40f3-46fa-b660-f4cf0813d332" (UID: "3d331406-40f3-46fa-b660-f4cf0813d332"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.164684 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d331406-40f3-46fa-b660-f4cf0813d332-kube-api-access-sq9vl" (OuterVolumeSpecName: "kube-api-access-sq9vl") pod "3d331406-40f3-46fa-b660-f4cf0813d332" (UID: "3d331406-40f3-46fa-b660-f4cf0813d332"). InnerVolumeSpecName "kube-api-access-sq9vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.235084 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rd2bl_3d331406-40f3-46fa-b660-f4cf0813d332/kube-multus-additional-cni-plugins/0.log" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.236104 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.236295 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rd2bl" event={"ID":"3d331406-40f3-46fa-b660-f4cf0813d332","Type":"ContainerDied","Data":"366e04e47e794072b6b2620815d25ba976ea13a369a82d5c75d3606e7b80b2f7"} Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.236359 4852 scope.go:117] "RemoveContainer" containerID="d6784cf4a66da05f6561f07c8fa0a166c537de0f19e1e5af385b1f34e41db7a0" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.266370 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3d331406-40f3-46fa-b660-f4cf0813d332-ready\") pod \"3d331406-40f3-46fa-b660-f4cf0813d332\" (UID: \"3d331406-40f3-46fa-b660-f4cf0813d332\") " Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.267536 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d331406-40f3-46fa-b660-f4cf0813d332-ready" (OuterVolumeSpecName: "ready") pod "3d331406-40f3-46fa-b660-f4cf0813d332" (UID: "3d331406-40f3-46fa-b660-f4cf0813d332"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.269125 4852 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d331406-40f3-46fa-b660-f4cf0813d332-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.269162 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq9vl\" (UniqueName: \"kubernetes.io/projected/3d331406-40f3-46fa-b660-f4cf0813d332-kube-api-access-sq9vl\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.269173 4852 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3d331406-40f3-46fa-b660-f4cf0813d332-ready\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.269185 4852 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d331406-40f3-46fa-b660-f4cf0813d332-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.573135 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rd2bl"] Dec 10 11:54:14 crc kubenswrapper[4852]: I1210 11:54:14.578284 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rd2bl"] Dec 10 11:54:16 crc kubenswrapper[4852]: I1210 11:54:16.177874 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" path="/var/lib/kubelet/pods/3d331406-40f3-46fa-b660-f4cf0813d332/volumes" Dec 10 11:54:16 crc kubenswrapper[4852]: I1210 11:54:16.565247 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:54:16 crc kubenswrapper[4852]: I1210 11:54:16.565312 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:54:18 crc kubenswrapper[4852]: E1210 11:54:18.056911 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dn8dv" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" Dec 10 11:54:18 crc kubenswrapper[4852]: E1210 11:54:18.056911 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-h2rb2" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" Dec 10 11:54:18 crc kubenswrapper[4852]: E1210 11:54:18.056952 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-662t9" podUID="2f614760-033c-494e-81d4-11c997e0db34" Dec 10 11:54:19 crc kubenswrapper[4852]: W1210 11:54:19.320643 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-1c5c3514464bccd5c98f1ec1f789bc9c31162afcdaf166d0c2579fdac8e473fd WatchSource:0}: Error finding container 1c5c3514464bccd5c98f1ec1f789bc9c31162afcdaf166d0c2579fdac8e473fd: Status 404 returned error can't find the container with id 1c5c3514464bccd5c98f1ec1f789bc9c31162afcdaf166d0c2579fdac8e473fd Dec 10 11:54:19 crc kubenswrapper[4852]: E1210 11:54:19.324483 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 10 11:54:19 crc kubenswrapper[4852]: E1210 11:54:19.324703 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-29h7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mqmsb_openshift-marketplace(aa4dd551-8252-43a4-b1b3-d4daf088ddd5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 10 11:54:19 crc kubenswrapper[4852]: E1210 11:54:19.325794 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mqmsb" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" Dec 10 11:54:19 crc kubenswrapper[4852]: I1210 11:54:19.585422 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 10 11:54:19 crc kubenswrapper[4852]: W1210 11:54:19.627896 4852 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-aaff01bef8ed6e530085c2f2d925326752571e60bdf849812c2291df24fda0b7 WatchSource:0}: Error finding container aaff01bef8ed6e530085c2f2d925326752571e60bdf849812c2291df24fda0b7: Status 404 returned error can't find the container with id aaff01bef8ed6e530085c2f2d925326752571e60bdf849812c2291df24fda0b7 Dec 10 11:54:19 crc kubenswrapper[4852]: I1210 11:54:19.831028 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 10 11:54:19 crc kubenswrapper[4852]: W1210 11:54:19.846893 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbe083c0f_9fb1_4b9a_9b5b_76bbacc4f009.slice/crio-63ed3d02305699fcc5be12bb0a3c5ab1e7d518a1b1c3b726194d5a435f65ff44 WatchSource:0}: Error finding container 63ed3d02305699fcc5be12bb0a3c5ab1e7d518a1b1c3b726194d5a435f65ff44: Status 404 returned error can't find the container with id 63ed3d02305699fcc5be12bb0a3c5ab1e7d518a1b1c3b726194d5a435f65ff44 Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.287469 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerStarted","Data":"5549465fd5846ccc01491e450854222fd19153fac1a641123562fd2b79a9e4bb"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.294534 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerStarted","Data":"37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.300145 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"14c618e4dd7b95815137e70448b51e9fcaf3068e157dffc3f51e6ac41604e087"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.300212 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f9cd5ba6bf0bd6d72991927ff69516b390498e0df7b08e9bcb82667e06f8907f"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.324909 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009","Type":"ContainerStarted","Data":"fd2ddf289706b283826aacdc3289f66e2d0a0a5367ac0b1c041085e5fa716db4"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.324953 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009","Type":"ContainerStarted","Data":"63ed3d02305699fcc5be12bb0a3c5ab1e7d518a1b1c3b726194d5a435f65ff44"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.340762 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3c622b66-ddfa-4f85-b9fe-baaaaed40eea","Type":"ContainerStarted","Data":"d06a918ce4d0c047edb8616f0df10facfb7667051a8f959d0bbf213011b40425"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.340837 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
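event={"ID":"3c622b66-ddfa-4f85-b9fe-baaaaed40eea","Type":"ContainerStarted","Data":"f7ed7fedb4afbb5feb187de6668a65faf8fde00b4360418018d6a780d262bda9"}

The recurring Readiness "Probe failed" entries for downloads-7954f5f757-ndbzv amount to an HTTP GET that keeps being refused. A rough sketch of the HTTP probe semantics (any 2xx/3xx status counts as success; the 1s timeout mirrors the probe default), not the actual prober.go code:

```go
// Sketch of what kubelet's HTTP readiness probe is doing above: GET the
// endpoint and treat any 2xx/3xx status as success.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func httpProbe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp ...: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil // probe success
	}
	return fmt.Errorf("unexpected status %d", resp.StatusCode)
}

func main() {
	if err := httpProbe("http://10.217.0.9:8080/", time.Second); err != nil {
		fmt.Println("Probe failed:", err) // what prober.go:107 reports above
	}
}
```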
event={"ID":"3c622b66-ddfa-4f85-b9fe-baaaaed40eea","Type":"ContainerStarted","Data":"f7ed7fedb4afbb5feb187de6668a65faf8fde00b4360418018d6a780d262bda9"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.353173 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ndbzv" event={"ID":"eaf52478-5cc3-48c5-9f24-fc1ad41a3601","Type":"ContainerStarted","Data":"e10c24d15413d9709fd4c2fb212223aa665cb17673e4e852e7b26f2360ef5d59"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.353264 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ndbzv" Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.355409 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.355468 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.366186 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"992e4dbf934400cec3764f355b3f4cb4a07b71bee5518b6967fef5a1aa339a5c"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.366336 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aaff01bef8ed6e530085c2f2d925326752571e60bdf849812c2291df24fda0b7"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.367285 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.377820 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c9d923fdf2444c9a862b6e85cb6fa04fa81d4088b1f10c052f45e11dbc32d36c"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.377885 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1c5c3514464bccd5c98f1ec1f789bc9c31162afcdaf166d0c2579fdac8e473fd"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.386339 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=29.386317842 podStartE2EDuration="29.386317842s" podCreationTimestamp="2025-12-10 11:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:54:20.38411421 +0000 UTC m=+146.469639464" watchObservedRunningTime="2025-12-10 11:54:20.386317842 +0000 UTC m=+146.471843066" Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.387513 4852 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerStarted","Data":"f3dc9326df04b4e8e1aa54ba578e2bf73788f9561e0ec9741ebd4a9486add6bb"} Dec 10 11:54:20 crc kubenswrapper[4852]: I1210 11:54:20.525142 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=34.525114145 podStartE2EDuration="34.525114145s" podCreationTimestamp="2025-12-10 11:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:54:20.520780722 +0000 UTC m=+146.606305946" watchObservedRunningTime="2025-12-10 11:54:20.525114145 +0000 UTC m=+146.610639389" Dec 10 11:54:21 crc kubenswrapper[4852]: I1210 11:54:21.401514 4852 generic.go:334] "Generic (PLEG): container finished" podID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerID="5549465fd5846ccc01491e450854222fd19153fac1a641123562fd2b79a9e4bb" exitCode=0 Dec 10 11:54:21 crc kubenswrapper[4852]: I1210 11:54:21.401790 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerDied","Data":"5549465fd5846ccc01491e450854222fd19153fac1a641123562fd2b79a9e4bb"} Dec 10 11:54:21 crc kubenswrapper[4852]: I1210 11:54:21.404860 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:54:21 crc kubenswrapper[4852]: I1210 11:54:21.404919 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:54:22 crc kubenswrapper[4852]: I1210 11:54:22.434301 4852 generic.go:334] "Generic (PLEG): container finished" podID="3c622b66-ddfa-4f85-b9fe-baaaaed40eea" containerID="d06a918ce4d0c047edb8616f0df10facfb7667051a8f959d0bbf213011b40425" exitCode=0 Dec 10 11:54:22 crc kubenswrapper[4852]: I1210 11:54:22.434522 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3c622b66-ddfa-4f85-b9fe-baaaaed40eea","Type":"ContainerDied","Data":"d06a918ce4d0c047edb8616f0df10facfb7667051a8f959d0bbf213011b40425"} Dec 10 11:54:23 crc kubenswrapper[4852]: I1210 11:54:23.444481 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerStarted","Data":"4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024"} Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.596473 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.616775 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nczw8" podStartSLOduration=22.167259131 podStartE2EDuration="1m16.61675295s" podCreationTimestamp="2025-12-10 11:53:08 +0000 UTC" firstStartedPulling="2025-12-10 11:53:27.436849815 +0000 UTC m=+93.522375039" lastFinishedPulling="2025-12-10 11:54:21.886343634 +0000 UTC m=+147.971868858" observedRunningTime="2025-12-10 11:54:23.493357626 +0000 UTC m=+149.578882850" watchObservedRunningTime="2025-12-10 11:54:24.61675295 +0000 UTC m=+150.702278194" Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.720328 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kube-api-access\") pod \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.720623 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kubelet-dir\") pod \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\" (UID: \"3c622b66-ddfa-4f85-b9fe-baaaaed40eea\") " Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.720978 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3c622b66-ddfa-4f85-b9fe-baaaaed40eea" (UID: "3c622b66-ddfa-4f85-b9fe-baaaaed40eea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.740014 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3c622b66-ddfa-4f85-b9fe-baaaaed40eea" (UID: "3c622b66-ddfa-4f85-b9fe-baaaaed40eea"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.822112 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:24 crc kubenswrapper[4852]: I1210 11:54:24.822161 4852 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c622b66-ddfa-4f85-b9fe-baaaaed40eea-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:25 crc kubenswrapper[4852]: I1210 11:54:25.536468 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3c622b66-ddfa-4f85-b9fe-baaaaed40eea","Type":"ContainerDied","Data":"f7ed7fedb4afbb5feb187de6668a65faf8fde00b4360418018d6a780d262bda9"} Dec 10 11:54:25 crc kubenswrapper[4852]: I1210 11:54:25.536514 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7ed7fedb4afbb5feb187de6668a65faf8fde00b4360418018d6a780d262bda9" Dec 10 11:54:25 crc kubenswrapper[4852]: I1210 11:54:25.536593 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.547923 4852 generic.go:334] "Generic (PLEG): container finished" podID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerID="f3dc9326df04b4e8e1aa54ba578e2bf73788f9561e0ec9741ebd4a9486add6bb" exitCode=0 Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.548017 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerDied","Data":"f3dc9326df04b4e8e1aa54ba578e2bf73788f9561e0ec9741ebd4a9486add6bb"} Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.564575 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.564800 4852 patch_prober.go:28] interesting pod/downloads-7954f5f757-ndbzv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.564675 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.564908 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ndbzv" podUID="eaf52478-5cc3-48c5-9f24-fc1ad41a3601" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.566558 4852 generic.go:334] "Generic (PLEG): container finished" podID="0161f217-65f3-4afe-8037-281871787a8b" containerID="37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f" exitCode=0 Dec 10 11:54:26 crc kubenswrapper[4852]: I1210 11:54:26.566631 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerDied","Data":"37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f"} Dec 10 11:54:28 crc kubenswrapper[4852]: I1210 11:54:28.645336 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nczw8" Dec 10 11:54:28 crc kubenswrapper[4852]: I1210 11:54:28.645654 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nczw8" Dec 10 11:54:30 crc kubenswrapper[4852]: I1210 11:54:30.160366 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nczw8" Dec 10 11:54:30 crc kubenswrapper[4852]: I1210 11:54:30.231400 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nczw8" Dec 10 11:54:30 crc kubenswrapper[4852]: I1210 11:54:30.406730 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nczw8"] Dec 10 
Dec 10 11:54:35 crc kubenswrapper[4852]: I1210 11:54:35.614517 4852 generic.go:334] "Generic (PLEG): container finished" podID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerID="4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024" exitCode=0 Dec 10 11:54:35 crc kubenswrapper[4852]: I1210 11:54:35.614597 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerDied","Data":"4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024"} Dec 10 11:54:36 crc kubenswrapper[4852]: I1210 11:54:36.585054 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ndbzv" Dec 10 11:54:38 crc kubenswrapper[4852]: E1210 11:54:38.646052 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024 is running failed: container process not found" containerID="4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 11:54:38 crc kubenswrapper[4852]: E1210 11:54:38.646692 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024 is running failed: container process not found" containerID="4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 11:54:38 crc kubenswrapper[4852]: E1210 11:54:38.647644 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024 is running failed: container process not found" containerID="4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 11:54:38 crc kubenswrapper[4852]: E1210 11:54:38.647680 4852 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-nczw8" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="registry-server" Dec 10 11:54:45 crc kubenswrapper[4852]: I1210 11:54:45.789847 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 11:54:45 crc kubenswrapper[4852]: I1210 11:54:45.790421 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 11:54:47 crc kubenswrapper[4852]: I1210 11:54:47.808461 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nczw8" Dec 10 11:54:47 crc kubenswrapper[4852]: I1210 11:54:47.985003 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-utilities\") pod \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " Dec 10 11:54:47 crc kubenswrapper[4852]: I1210 11:54:47.985126 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-catalog-content\") pod \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " Dec 10 11:54:47 crc kubenswrapper[4852]: I1210 11:54:47.985158 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8vfs\" (UniqueName: \"kubernetes.io/projected/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-kube-api-access-j8vfs\") pod \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\" (UID: \"c4ee2051-d987-4dea-abca-3fcfbdb63ac5\") " Dec 10 11:54:47 crc kubenswrapper[4852]: I1210 11:54:47.986266 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-utilities" (OuterVolumeSpecName: "utilities") pod "c4ee2051-d987-4dea-abca-3fcfbdb63ac5" (UID: "c4ee2051-d987-4dea-abca-3fcfbdb63ac5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:54:47 crc kubenswrapper[4852]: I1210 11:54:47.991024 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-kube-api-access-j8vfs" (OuterVolumeSpecName: "kube-api-access-j8vfs") pod "c4ee2051-d987-4dea-abca-3fcfbdb63ac5" (UID: "c4ee2051-d987-4dea-abca-3fcfbdb63ac5"). InnerVolumeSpecName "kube-api-access-j8vfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.006281 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ee2051-d987-4dea-abca-3fcfbdb63ac5" (UID: "c4ee2051-d987-4dea-abca-3fcfbdb63ac5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.090506 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.090589 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.090609 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8vfs\" (UniqueName: \"kubernetes.io/projected/c4ee2051-d987-4dea-abca-3fcfbdb63ac5-kube-api-access-j8vfs\") on node \"crc\" DevicePath \"\"" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.697917 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nczw8" event={"ID":"c4ee2051-d987-4dea-abca-3fcfbdb63ac5","Type":"ContainerDied","Data":"e78145b8910729a13503ca87ce7bdeb2c39020f0957de0b6e2869c473edd1eda"} Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.697973 4852 scope.go:117] "RemoveContainer" containerID="4e7c1ad496542058fd056edceef8bd6c52d51c02f8b658ba5fbdc489f3f33024" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.697975 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nczw8" Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.738705 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nczw8"] Dec 10 11:54:48 crc kubenswrapper[4852]: I1210 11:54:48.744825 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nczw8"] Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.176711 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" path="/var/lib/kubelet/pods/c4ee2051-d987-4dea-abca-3fcfbdb63ac5/volumes" Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.331882 4852 scope.go:117] "RemoveContainer" containerID="5549465fd5846ccc01491e450854222fd19153fac1a641123562fd2b79a9e4bb" Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.371625 4852 scope.go:117] "RemoveContainer" containerID="728e4b041fe8b805f8c87dc83ffe25e435b00375a90badc0feaeb328fb71fe13" Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.488358 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.715759 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerStarted","Data":"8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.722877 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerStarted","Data":"a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.729695 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" 
event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerStarted","Data":"41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.732518 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerStarted","Data":"b8a9ba403f31399fedaa5a98dcfc33c4dd6e5c72badf4910b85f4b875feac281"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.737762 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerStarted","Data":"3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.740288 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerStarted","Data":"5b9636d606893116b2f4d06610d889aff1cd3d43d5b49c70783fef222c934c13"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.743979 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerStarted","Data":"753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7"} Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.786749 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cn5tj" podStartSLOduration=18.886923031 podStartE2EDuration="1m41.786723861s" podCreationTimestamp="2025-12-10 11:53:09 +0000 UTC" firstStartedPulling="2025-12-10 11:53:27.437003189 +0000 UTC m=+93.522528413" lastFinishedPulling="2025-12-10 11:54:50.336804029 +0000 UTC m=+176.422329243" observedRunningTime="2025-12-10 11:54:50.78424598 +0000 UTC m=+176.869771224" watchObservedRunningTime="2025-12-10 11:54:50.786723861 +0000 UTC m=+176.872249085" Dec 10 11:54:50 crc kubenswrapper[4852]: I1210 11:54:50.874242 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6d5jr" podStartSLOduration=18.944996201 podStartE2EDuration="1m41.874192745s" podCreationTimestamp="2025-12-10 11:53:09 +0000 UTC" firstStartedPulling="2025-12-10 11:53:27.436437634 +0000 UTC m=+93.521962858" lastFinishedPulling="2025-12-10 11:54:50.365634178 +0000 UTC m=+176.451159402" observedRunningTime="2025-12-10 11:54:50.871688354 +0000 UTC m=+176.957213608" watchObservedRunningTime="2025-12-10 11:54:50.874192745 +0000 UTC m=+176.959717969" Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.427862 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r7qp9"] Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.780750 4852 generic.go:334] "Generic (PLEG): container finished" podID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerID="3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1" exitCode=0 Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.780811 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerDied","Data":"3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1"} Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.786255 4852 generic.go:334] "Generic 
(PLEG): container finished" podID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerID="5b9636d606893116b2f4d06610d889aff1cd3d43d5b49c70783fef222c934c13" exitCode=0 Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.786279 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerDied","Data":"5b9636d606893116b2f4d06610d889aff1cd3d43d5b49c70783fef222c934c13"} Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.800422 4852 generic.go:334] "Generic (PLEG): container finished" podID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerID="8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d" exitCode=0 Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.800489 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerDied","Data":"8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d"} Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.804350 4852 generic.go:334] "Generic (PLEG): container finished" podID="2f614760-033c-494e-81d4-11c997e0db34" containerID="a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef" exitCode=0 Dec 10 11:54:51 crc kubenswrapper[4852]: I1210 11:54:51.804405 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerDied","Data":"a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef"} Dec 10 11:54:52 crc kubenswrapper[4852]: I1210 11:54:52.815174 4852 generic.go:334] "Generic (PLEG): container finished" podID="b65cc728-9de3-466f-902b-47f30708118c" containerID="41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4" exitCode=0 Dec 10 11:54:52 crc kubenswrapper[4852]: I1210 11:54:52.815282 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerDied","Data":"41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4"} Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.263270 4852 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.264810 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="extract-utilities" Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.264871 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="extract-utilities" Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.264916 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="extract-content" Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.264937 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="extract-content" Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.264959 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="registry-server" Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.264977 4852 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="registry-server"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.265007 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c622b66-ddfa-4f85-b9fe-baaaaed40eea" containerName="pruner"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.265025 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c622b66-ddfa-4f85-b9fe-baaaaed40eea" containerName="pruner"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.265054 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.265072 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.265357 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ee2051-d987-4dea-abca-3fcfbdb63ac5" containerName="registry-server"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.265395 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d331406-40f3-46fa-b660-f4cf0813d332" containerName="kube-multus-additional-cni-plugins"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.265426 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c622b66-ddfa-4f85-b9fe-baaaaed40eea" containerName="pruner"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266105 4852 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266168 4852 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266247 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266591 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df" gracePeriod=15
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266624 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77" gracePeriod=15
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266738 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39" gracePeriod=15
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266767 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c" gracePeriod=15
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.266769 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221" gracePeriod=15
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.266860 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267071 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.267085 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267092 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.267104 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267113 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.267119 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267125 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.267133 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267139 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.267146 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267151 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 10 11:54:58 crc kubenswrapper[4852]: E1210 11:54:58.267158 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267165 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267295 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267305 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267316 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267325 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267333 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.267490 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.279991 4852 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.287537 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.287659 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.287692 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.289537 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.289631 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.289725 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.289782 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.289968 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.310885 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391307 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391380 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391408 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391435 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391457 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391484 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391508 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391527 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391598 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391672 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391695 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391720 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391755 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391775 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391797 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.391818 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.608512 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.852086 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.853443 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.854374 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c" exitCode=0
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.854408 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77" exitCode=0
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.854418 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39" exitCode=0
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.854425 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221" exitCode=2
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.854491 4852 scope.go:117] "RemoveContainer" containerID="49ccb60e4792488297df3e91b644c0e72dc40bafe1491f5e2b6421ef25893c86"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.856924 4852 generic.go:334] "Generic (PLEG): container finished" podID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" containerID="fd2ddf289706b283826aacdc3289f66e2d0a0a5367ac0b1c041085e5fa716db4" exitCode=0
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.856960 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009","Type":"ContainerDied","Data":"fd2ddf289706b283826aacdc3289f66e2d0a0a5367ac0b1c041085e5fa716db4"}
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.857763 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:58 crc kubenswrapper[4852]: I1210 11:54:58.858062 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.701739 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.701815 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.741848 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.742783 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.743295 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.743758 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.873171 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.886904 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.887605 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.927003 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cn5tj"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.927624 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.929791 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.930458 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.931906 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.932257 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.932457 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.934073 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:54:59 crc kubenswrapper[4852]: I1210 11:54:59.934598 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.574119 4852 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.574577 4852 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.575008 4852 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.575649 4852 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.575965 4852 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: I1210 11:55:00.575994 4852 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.576286 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="200ms"
Dec 10 11:55:00 crc kubenswrapper[4852]: E1210 11:55:00.777602 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="400ms"
Dec 10 11:55:00 crc kubenswrapper[4852]: I1210 11:55:00.919538 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6d5jr"
Dec 10 11:55:00 crc kubenswrapper[4852]: I1210 11:55:00.920178 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: I1210 11:55:00.920636 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: I1210 11:55:00.920902 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:00 crc kubenswrapper[4852]: I1210 11:55:00.921151 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:01 crc kubenswrapper[4852]: E1210 11:55:01.179477 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="800ms"
Dec 10 11:55:01 crc kubenswrapper[4852]: E1210 11:55:01.981531 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="1.6s"
Dec 10 11:55:03 crc kubenswrapper[4852]: E1210 11:55:03.582629 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="3.2s"
Dec 10 11:55:04 crc kubenswrapper[4852]: I1210 11:55:04.172921 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:04 crc kubenswrapper[4852]: I1210 11:55:04.176775 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:04 crc kubenswrapper[4852]: I1210 11:55:04.177189 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:04 crc kubenswrapper[4852]: I1210 11:55:04.177528 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:06 crc kubenswrapper[4852]: E1210 11:55:06.784210 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="6.4s"
Dec 10 11:55:07 crc kubenswrapper[4852]: I1210 11:55:07.917823 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 10 11:55:07 crc kubenswrapper[4852]: I1210 11:55:07.918940 4852 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df" exitCode=0
Dec 10 11:55:10 crc kubenswrapper[4852]: E1210 11:55:10.258961 4852 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.73:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" volumeName="registry-storage"
Dec 10 11:55:11 crc kubenswrapper[4852]: E1210 11:55:11.779614 4852 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.73:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-dn8dv.187fd8949e7e3598 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-dn8dv,UID:eb61d8f4-66d0-4d11-955f-4984ab5e18e6,APIVersion:v1,ResourceVersion:28143,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 19.995s (19.995s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-10 11:55:11.778354584 +0000 UTC m=+197.863879808,LastTimestamp:2025-12-10 11:55:11.778354584 +0000 UTC m=+197.863879808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.829377 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.830391 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.830895 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.831152 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.831412 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.896839 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kube-api-access\") pod \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") "
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.896953 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kubelet-dir\") pod \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") "
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.897051 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-var-lock\") pod \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\" (UID: \"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009\") "
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.897465 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-var-lock" (OuterVolumeSpecName: "var-lock") pod "be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" (UID: "be083c0f-9fb1-4b9a-9b5b-76bbacc4f009"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.897576 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" (UID: "be083c0f-9fb1-4b9a-9b5b-76bbacc4f009"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.902831 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" (UID: "be083c0f-9fb1-4b9a-9b5b-76bbacc4f009"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.938740 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"be083c0f-9fb1-4b9a-9b5b-76bbacc4f009","Type":"ContainerDied","Data":"63ed3d02305699fcc5be12bb0a3c5ab1e7d518a1b1c3b726194d5a435f65ff44"}
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.939102 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ed3d02305699fcc5be12bb0a3c5ab1e7d518a1b1c3b726194d5a435f65ff44"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.938788 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.956113 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.956900 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.957591 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.958216 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.998126 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.998171 4852 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:11 crc kubenswrapper[4852]: I1210 11:55:11.998180 4852 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be083c0f-9fb1-4b9a-9b5b-76bbacc4f009-var-lock\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:13 crc kubenswrapper[4852]: E1210 11:55:13.185832 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="7s"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.951148 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.951218 4852 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1" exitCode=1
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.951278 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1"}
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.951817 4852 scope.go:117] "RemoveContainer" containerID="60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.952384 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.952680 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.953008 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.953283 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:13 crc kubenswrapper[4852]: I1210 11:55:13.953531 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.173829 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.174465 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.174667 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.175005 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.175456 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: W1210 11:55:14.864751 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6784f4213cac8fa12067f564d6b27c73bfd5b66b012ad6806eaefb7b3f7b3674 WatchSource:0}: Error finding container 6784f4213cac8fa12067f564d6b27c73bfd5b66b012ad6806eaefb7b3f7b3674: Status 404 returned error can't find the container with id 6784f4213cac8fa12067f564d6b27c73bfd5b66b012ad6806eaefb7b3f7b3674
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.890901 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.891750 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.892380 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.892649 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.893021 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.894401 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.895062 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.895603 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.940860 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.940942 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.941039 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.941071 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.941258 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.941023 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.941465 4852 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.941488 4852 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.975984 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.977002 4852 scope.go:117] "RemoveContainer" containerID="bf83a18ea63a7834fedae56f0a92895ade4de1ab4502727033ff60444db4744c"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.977091 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.978166 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6784f4213cac8fa12067f564d6b27c73bfd5b66b012ad6806eaefb7b3f7b3674"}
Dec 10 11:55:14 crc kubenswrapper[4852]: I1210 11:55:14.993958 4852 scope.go:117] "RemoveContainer" containerID="9d10e9811c43b787065a876e65cad9d35819f6479ed711cfff8436bed291ee77"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.025770 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.026322 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.027029 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.027312 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.027532 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.027728 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.075804 4852 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.079916 4852 scope.go:117] "RemoveContainer" containerID="f1d7394b650c225c2f1ff9dc8d6f106eb5fcb971e9ed348f7cda490f6ccf8a39"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.106513 4852 scope.go:117] "RemoveContainer" containerID="50a1ad9590fe87401aa747a7fdbf95bea2f41243b174e5e921ddc426873c8221"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.134329 4852 scope.go:117] "RemoveContainer" containerID="aafde7d3ae8a2eb06c3fea7ad722779458682b3b130caae168e7a40a9cd010df"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.161322 4852 scope.go:117] "RemoveContainer" containerID="11120382dd599b63219c91cd1206a0050ee3623ebeeeebfb3924a5d8f8b4b82c"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.790880 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.790972 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.989044 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerStarted","Data":"a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7"}
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.990297 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.990717 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.991264 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.991652 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.992145 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.992384 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.992566 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerStarted","Data":"ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f"}
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.992640 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.992984 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.993332 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.993746 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.994008 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.994297 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.994634 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.994732 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a"}
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.994830 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.995019 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.995407 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.995683 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.995934 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.996352 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.997119 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.997554 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerStarted","Data":"a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe"}
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.997638 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.997902 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.998123 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.998451 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.998740 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.999002 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.999216 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:15 crc kubenswrapper[4852]: I1210 11:55:15.999462 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.003397 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.003829 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df"
pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.004155 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.004576 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.005555 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.005696 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"05106b24d0cbf39a951f524d03c1b2ba5f589433d0354e532885f6662d8b59ad"} Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.006927 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.007270 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.007758 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.008111 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.008592 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial 
tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.008686 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerStarted","Data":"d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c"} Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.008939 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.009153 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.009400 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.009620 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.009895 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.010089 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.010334 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.010653 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection 
refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.010907 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.011207 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.011423 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.011688 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.011934 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.012175 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.013505 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerStarted","Data":"7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56"} Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.014006 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.014471 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 
11:55:16.014765 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.015041 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.015281 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.015528 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.015783 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.016046 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.016333 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.016652 4852 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.016929 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: 
I1210 11:55:16.187659 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.342590 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.342664 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.486354 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.486434 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.507916 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" containerName="oauth-openshift" containerID="cri-o://48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6" gracePeriod=15 Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.878738 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.879365 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.915544 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.916592 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.921357 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.922147 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.922940 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.923641 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.924988 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.925452 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.925839 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.926193 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.926578 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:16 crc kubenswrapper[4852]: I1210 11:55:16.926842 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.004648 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78h9t\" (UniqueName: \"kubernetes.io/projected/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-kube-api-access-78h9t\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.005521 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-idp-0-file-data\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.005670 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-cliconfig\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.005796 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-dir\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.005903 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-service-ca\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006006 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-login\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006150 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-ocp-branding-template\") pod 
\"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006299 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-router-certs\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006425 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-trusted-ca-bundle\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006549 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-error\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006645 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-policies\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006735 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-serving-cert\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006840 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-provider-selection\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006989 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-session\") pod \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\" (UID: \"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d\") " Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.005921 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006394 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.006563 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.007317 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.008392 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.008437 4852 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.008456 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.008473 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.008880 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.012365 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-kube-api-access-78h9t" (OuterVolumeSpecName: "kube-api-access-78h9t") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "kube-api-access-78h9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.013492 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.014584 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.014957 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.015601 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.018660 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.018911 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.019688 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.019881 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" (UID: "213e0da5-7024-4329-b4cb-ae2fa8fe1f0d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.020937 4852 generic.go:334] "Generic (PLEG): container finished" podID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" containerID="48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6" exitCode=0 Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.021018 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.021082 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" event={"ID":"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d","Type":"ContainerDied","Data":"48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6"} Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.021117 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" event={"ID":"213e0da5-7024-4329-b4cb-ae2fa8fe1f0d","Type":"ContainerDied","Data":"187e7fad110a59cd05210e2f117d4590435afdd1e26c2cc88e16e34258dfcabe"} Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.021142 4852 scope.go:117] "RemoveContainer" containerID="48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.022058 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.022252 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.022448 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.022595 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.022799 4852 status_manager.go:851] "Failed to get 
status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.023595 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.024303 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.024630 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.027383 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.027852 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.029891 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.068725 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.069077 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 
11:55:17.069587 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.070136 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.070461 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.070802 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.071141 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.071497 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.071785 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.072068 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.072367 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" Dec 
10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.076754 4852 scope.go:117] "RemoveContainer" containerID="48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6"
Dec 10 11:55:17 crc kubenswrapper[4852]: E1210 11:55:17.077515 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6\": container with ID starting with 48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6 not found: ID does not exist" containerID="48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.077561 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6"} err="failed to get container status \"48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6\": rpc error: code = NotFound desc = could not find container \"48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6\": container with ID starting with 48b9ac745ec07ece156fbd4831bf78367cedeaecdbd01778c02d36136fd9f3c6 not found: ID does not exist"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110379 4852 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110673 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110750 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110816 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110874 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78h9t\" (UniqueName: \"kubernetes.io/projected/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-kube-api-access-78h9t\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110934 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.110995 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.111056 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.111117 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.111188 4852 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.339712 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dn8dv"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.339983 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dn8dv"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.384742 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dn8dv"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.385314 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.385688 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.386078 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.386343 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.386641 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.386926 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.387136 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.387412 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.387678 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.387904 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.388209 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.393604 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-662t9" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="registry-server" probeResult="failure" output=<
Dec 10 11:55:17 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 11:55:17 crc kubenswrapper[4852]: >
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.538922 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xmxbw" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="registry-server" probeResult="failure" output=<
Dec 10 11:55:17 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 11:55:17 crc kubenswrapper[4852]: >
Dec 10 11:55:17 crc kubenswrapper[4852]: I1210 11:55:17.932016 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-h2rb2" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="registry-server" probeResult="failure" output=<
Dec 10 11:55:17 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 11:55:17 crc kubenswrapper[4852]: >
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.275038 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mqmsb"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.275097 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mqmsb"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.316563 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mqmsb"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.317140 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.317408 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.317676 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.317992 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.318312 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.318738 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.319057 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.319281 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.319842 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.320118 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:18 crc kubenswrapper[4852]: I1210 11:55:18.320490 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:19 crc kubenswrapper[4852]: E1210 11:55:19.207134 4852 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.73:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-dn8dv.187fd8949e7e3598 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-dn8dv,UID:eb61d8f4-66d0-4d11-955f-4984ab5e18e6,APIVersion:v1,ResourceVersion:28143,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 19.995s (19.995s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-10 11:55:11.778354584 +0000 UTC m=+197.863879808,LastTimestamp:2025-12-10 11:55:11.778354584 +0000 UTC m=+197.863879808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 10 11:55:19 crc kubenswrapper[4852]: I1210 11:55:19.600921 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:55:19 crc kubenswrapper[4852]: I1210 11:55:19.601005 4852 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 10 11:55:19 crc kubenswrapper[4852]: I1210 11:55:19.601040 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 10 11:55:20 crc kubenswrapper[4852]: E1210 11:55:20.186898 4852 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused" interval="7s"
Dec 10 11:55:20 crc kubenswrapper[4852]: I1210 11:55:20.240488 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.043348 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:55:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:55:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:55:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-10T11:55:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:789c4ba609078742a4780cfc8f8931c5c262a99dcfa8c1ce6894abd49ea2ec0c\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae6c14b814ac7e185e03307bea69b261e941d241ce839e9871994c5619a1d59d\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1626187079},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:98cd56e57d8c89e59c8ac0d99815cb93378bf6a147e8daaf50bb24e704e676ab\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b5b85200d1f34b104b43b44b4ef97ebfc425b72cc1cfaf11c46bb3fb7e0f528a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1216426831},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:26ea35413bef0c078547a03dc093f1d4f15a7d9fc91f05c687b1a437352f0856\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ba6f8ed6f9b58e63d979cc9729f5ce2c6f22e99b35ce91b902e198eb78d6b106\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201960779},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1412aa5a552c366a3db6d77ed2b66514a04c3de87179b9eb31c42e6e1dfff68e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:c888c734987a3d160e171aeebeac8b44b9056240fa354958f260a29e70f3d4b7\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1142487363},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.043903 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.044332 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.044527 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.044781 4852 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.044797 4852 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.170189 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.171591 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.185741 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.187104 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.187509 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.188106 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.189441 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.189483 4852 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.189507 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.189899 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: E1210 11:55:22.189967 4852 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.190730 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.191100 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.191521 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.192353 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:22 crc kubenswrapper[4852]: I1210 11:55:22.192840 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:23 crc kubenswrapper[4852]: I1210 11:55:23.060337 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85032e7778e96a91f7981b6a704328535aaf46629071763c74e79c94393010c8"}
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.069495 4852 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d310821c7ded02333a85692a4deec02b699fb9fa19e76f584c97a51abe2e08bb" exitCode=0
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.069572 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d310821c7ded02333a85692a4deec02b699fb9fa19e76f584c97a51abe2e08bb"}
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.070115 4852 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.070176 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.070754 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: E1210 11:55:24.070958 4852 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.071353 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.071718 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.072075 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.073299 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.073643 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.073976 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.074215 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.074552 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.074886 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.075187 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.178039 4852 status_manager.go:851] "Failed to get status for pod" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" pod="openshift-marketplace/redhat-operators-cn5tj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cn5tj\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.178858 4852 status_manager.go:851] "Failed to get status for pod" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" pod="openshift-marketplace/community-operators-h2rb2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-h2rb2\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.179328 4852 status_manager.go:851] "Failed to get status for pod" podUID="2f614760-033c-494e-81d4-11c997e0db34" pod="openshift-marketplace/certified-operators-662t9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-662t9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.179868 4852 status_manager.go:851] "Failed to get status for pod" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" pod="openshift-marketplace/certified-operators-dn8dv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-dn8dv\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.180308 4852 status_manager.go:851] "Failed to get status for pod" podUID="b65cc728-9de3-466f-902b-47f30708118c" pod="openshift-marketplace/community-operators-xmxbw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-xmxbw\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.180638 4852 status_manager.go:851] "Failed to get status for pod" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" pod="openshift-authentication/oauth-openshift-558db77b4-r7qp9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-r7qp9\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.180974 4852 status_manager.go:851] "Failed to get status for pod" podUID="0161f217-65f3-4afe-8037-281871787a8b" pod="openshift-marketplace/redhat-operators-6d5jr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-6d5jr\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.181556 4852 status_manager.go:851] "Failed to get status for pod" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" pod="openshift-marketplace/redhat-marketplace-mqmsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-mqmsb\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.182521 4852 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.182832 4852 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.183209 4852 status_manager.go:851] "Failed to get status for pod" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:24 crc kubenswrapper[4852]: I1210 11:55:24.183550 4852 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.73:6443: connect: connection refused"
Dec 10 11:55:25 crc kubenswrapper[4852]: I1210 11:55:25.082078 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fade9be2511db568ef5a722f2ff1ea5652076c1cceaaba412d627a4722773d51"}
Dec 10 11:55:25 crc kubenswrapper[4852]: I1210 11:55:25.082552 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"efc796ab0b66a898d86e070ca1d6f2309f5f8547ae757fcdc5bd569f914d5fd7"}
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.106197 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c33ec1e61ad42422164926a8823718fdfd603632a7de3b1ab9a80acaa72e2193"}
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.106266 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d5e124ab5ce68c020772fa2cf88d1c46ac2966461978b84e3e1b7feed7542252"}
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.106280 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e72df91723aa44be8200da5ebde2b131d6bac736b3b1e34fd214b2e36352a49d"}
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.106436 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.106719 4852 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.106753 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.383690 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-662t9"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.421720 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-662t9"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.540709 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xmxbw"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.578219 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xmxbw"
Dec 10 11:55:26 crc kubenswrapper[4852]: I1210 11:55:26.963123 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h2rb2"
Dec 10 11:55:27 crc kubenswrapper[4852]: I1210 11:55:27.012919 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h2rb2"
Dec 10 11:55:27 crc kubenswrapper[4852]: I1210 11:55:27.191169 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:27 crc kubenswrapper[4852]: I1210 11:55:27.191344 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:27 crc kubenswrapper[4852]: I1210 11:55:27.197436 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:27 crc kubenswrapper[4852]: I1210 11:55:27.382623 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dn8dv"
Dec 10 11:55:28 crc kubenswrapper[4852]: I1210 11:55:28.323984 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mqmsb"
Dec 10 11:55:29 crc kubenswrapper[4852]: I1210 11:55:29.601523 4852 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 10 11:55:29 crc kubenswrapper[4852]: I1210 11:55:29.602094 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 10 11:55:31 crc kubenswrapper[4852]: I1210 11:55:31.122181 4852 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:31 crc kubenswrapper[4852]: I1210 11:55:31.197548 4852 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="02130f77-4aed-4ceb-bf6f-28c0eca007dc"
Dec 10 11:55:32 crc kubenswrapper[4852]: I1210 11:55:32.139737 4852 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:32 crc kubenswrapper[4852]: I1210 11:55:32.139770 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:32 crc kubenswrapper[4852]: I1210 11:55:32.143720 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 10 11:55:32 crc kubenswrapper[4852]: I1210 11:55:32.143810 4852 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="02130f77-4aed-4ceb-bf6f-28c0eca007dc"
Dec 10 11:55:33 crc kubenswrapper[4852]: I1210 11:55:33.144706 4852 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:33 crc kubenswrapper[4852]: I1210 11:55:33.144748 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d"
Dec 10 11:55:33 crc kubenswrapper[4852]: I1210 11:55:33.148917 4852 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="02130f77-4aed-4ceb-bf6f-28c0eca007dc"
Dec 10 11:55:39 crc kubenswrapper[4852]: I1210 11:55:39.600924 4852 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 10 11:55:39 crc kubenswrapper[4852]: I1210 11:55:39.601486 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 10 11:55:39 crc kubenswrapper[4852]: I1210 11:55:39.601579 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 10 11:55:39 crc kubenswrapper[4852]: I1210 11:55:39.603211 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"05106b24d0cbf39a951f524d03c1b2ba5f589433d0354e532885f6662d8b59ad"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Dec 10 11:55:39 crc kubenswrapper[4852]: I1210 11:55:39.603357 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://05106b24d0cbf39a951f524d03c1b2ba5f589433d0354e532885f6662d8b59ad" gracePeriod=30
Dec 10 11:55:45 crc kubenswrapper[4852]: I1210 11:55:45.790498 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 11:55:45 crc kubenswrapper[4852]: I1210 11:55:45.790875 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 11:55:45 crc kubenswrapper[4852]: I1210 11:55:45.790936 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh"
Dec 10 11:55:45 crc kubenswrapper[4852]: I1210 11:55:45.791777 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df5387583f66f93b26a76954748f69c02df08bb9c349c9c9465ec2fb73fa4fd0"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 10 11:55:45 crc kubenswrapper[4852]: I1210 11:55:45.791834 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://df5387583f66f93b26a76954748f69c02df08bb9c349c9c9465ec2fb73fa4fd0" gracePeriod=600
Dec 10 11:55:46 crc kubenswrapper[4852]: I1210 11:55:46.218217 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="df5387583f66f93b26a76954748f69c02df08bb9c349c9c9465ec2fb73fa4fd0" exitCode=0
Dec 10 11:55:46 crc kubenswrapper[4852]: I1210 11:55:46.218316 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"df5387583f66f93b26a76954748f69c02df08bb9c349c9c9465ec2fb73fa4fd0"}
Dec 10 11:55:46 crc kubenswrapper[4852]: I1210 11:55:46.218687 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"81155a692a32264fc4c153e9736724fd23b0a15aa61c032f0b091089d2a44202"}
Dec 10 11:55:57 crc kubenswrapper[4852]: I1210 11:55:57.462941 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 10 11:55:57 crc kubenswrapper[4852]: I1210 11:55:57.792745 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 10 11:55:57 crc kubenswrapper[4852]: I1210 11:55:57.972358 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 10 11:55:58 crc kubenswrapper[4852]: I1210 11:55:58.783607 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 10 11:55:58 crc kubenswrapper[4852]: I1210 11:55:58.837064 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 10 11:55:59 crc kubenswrapper[4852]: I1210 11:55:59.932513 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 10 11:55:59 crc kubenswrapper[4852]: I1210 11:55:59.943828 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 10 11:56:00 crc kubenswrapper[4852]: I1210 11:56:00.698366 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 10 11:56:01 crc kubenswrapper[4852]: I1210 11:56:01.474128 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 10 11:56:02 crc kubenswrapper[4852]: I1210 11:56:02.122343 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 10 11:56:02 crc kubenswrapper[4852]: I1210 11:56:02.204593 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 10 11:56:02 crc kubenswrapper[4852]: I1210 11:56:02.359660 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 10 11:56:02 crc kubenswrapper[4852]: I1210 11:56:02.751483 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 10 11:56:03 crc kubenswrapper[4852]: I1210 11:56:03.155000 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 10 11:56:03 crc kubenswrapper[4852]: I1210 11:56:03.531904 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 10 11:56:03 crc kubenswrapper[4852]: I1210 11:56:03.638393 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 10 11:56:03 crc kubenswrapper[4852]: I1210 11:56:03.839521 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 10 11:56:03 crc kubenswrapper[4852]: I1210 11:56:03.990532 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 10 11:56:04 crc kubenswrapper[4852]: I1210 11:56:04.337900 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 10 11:56:04 crc kubenswrapper[4852]: I1210 11:56:04.379416 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 10 11:56:04 crc kubenswrapper[4852]: I1210 11:56:04.521550 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 10 11:56:04 crc kubenswrapper[4852]: I1210 11:56:04.769317 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 10 11:56:05 crc kubenswrapper[4852]: I1210 11:56:05.119202 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 10 11:56:05 crc kubenswrapper[4852]: I1210 11:56:05.247958 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 10 11:56:06 crc kubenswrapper[4852]: I1210 11:56:06.302474 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 10 11:56:06 crc kubenswrapper[4852]: I1210 11:56:06.551853 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 10 11:56:06 crc kubenswrapper[4852]: I1210 11:56:06.579609 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 10 11:56:06 crc kubenswrapper[4852]: I1210 11:56:06.709886 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 10 11:56:06 crc kubenswrapper[4852]: I1210 11:56:06.716414 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 10 11:56:06 crc kubenswrapper[4852]: I1210 11:56:06.954733 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.023300 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.025211 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.169201 4852 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.232808 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.344730 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.575919 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.576168 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.654554 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.721657 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 10 11:56:07 crc kubenswrapper[4852]: I1210 11:56:07.824699 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 10 11:56:08 crc kubenswrapper[4852]: I1210 11:56:08.220832 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 10 11:56:08 crc kubenswrapper[4852]: I1210 11:56:08.224784 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 10 11:56:08 crc kubenswrapper[4852]: I1210 11:56:08.474506 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 10 11:56:08 crc kubenswrapper[4852]: I1210 11:56:08.511813 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 10 11:56:08 crc kubenswrapper[4852]: I1210 11:56:08.598787 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.023760 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.236541 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.322742 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.357283 4852 generic.go:334] "Generic (PLEG): container finished" podID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerID="e697d3d8c05addb7c0d0f2ab7820da2aa002d00d4aa5f7cccae8fee93f733f5a" exitCode=0
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.357420 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerDied","Data":"e697d3d8c05addb7c0d0f2ab7820da2aa002d00d4aa5f7cccae8fee93f733f5a"}
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.358246 4852 scope.go:117] "RemoveContainer" containerID="e697d3d8c05addb7c0d0f2ab7820da2aa002d00d4aa5f7cccae8fee93f733f5a"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.629986 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.635737 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.693089 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.703192 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.775962 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.806710 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.910120 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 10 11:56:09 crc kubenswrapper[4852]: I1210 11:56:09.967287 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.049496 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.362937 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4cv5l_1919e18e-d914-4ee7-8bf4-6de02e6760c2/marketplace-operator/1.log"
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.363037 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.364112 4852 generic.go:334] "Generic (PLEG): container finished" podID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerID="f466e917954b90e012d340ea25b8b410327d3d7eee4f3803597e3074b432fdc5" exitCode=1
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.364167 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerDied","Data":"f466e917954b90e012d340ea25b8b410327d3d7eee4f3803597e3074b432fdc5"}
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.364212 4852 scope.go:117] "RemoveContainer" containerID="e697d3d8c05addb7c0d0f2ab7820da2aa002d00d4aa5f7cccae8fee93f733f5a"
Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.366360 4852 scope.go:117] "RemoveContainer" containerID="f466e917954b90e012d340ea25b8b410327d3d7eee4f3803597e3074b432fdc5"
Dec 10 11:56:10 crc kubenswrapper[4852]: E1210 11:56:10.366682 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-4cv5l_openshift-marketplace(1919e18e-d914-4ee7-8bf4-6de02e6760c2)\"" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2"
Dec 10 11:56:10 crc 
kubenswrapper[4852]: I1210 11:56:10.367348 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.374172 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.374249 4852 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="05106b24d0cbf39a951f524d03c1b2ba5f589433d0354e532885f6662d8b59ad" exitCode=137 Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.374287 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"05106b24d0cbf39a951f524d03c1b2ba5f589433d0354e532885f6662d8b59ad"} Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.374320 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1923f35c04a0e3347b90185910a7dfc954aa13a0116c32bb6320ca2601fc6b80"} Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.396561 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.398539 4852 scope.go:117] "RemoveContainer" containerID="60da28f7b81b5df1297da2587aadd572ee9a19886479e808ebac62468bf55cf1" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.414618 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.491220 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.502060 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.550675 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.578898 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.612499 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.690922 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.732520 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.771951 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.837480 
4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.906396 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 10 11:56:10 crc kubenswrapper[4852]: I1210 11:56:10.975332 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.259801 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.334761 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.336144 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.387263 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4cv5l_1919e18e-d914-4ee7-8bf4-6de02e6760c2/marketplace-operator/1.log" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.391984 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.393533 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.746077 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.763986 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 10 11:56:11 crc kubenswrapper[4852]: I1210 11:56:11.808402 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 10 11:56:12 crc kubenswrapper[4852]: I1210 11:56:12.047751 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 10 11:56:12 crc kubenswrapper[4852]: I1210 11:56:12.068958 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 10 11:56:12 crc kubenswrapper[4852]: I1210 11:56:12.207146 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 10 11:56:12 crc kubenswrapper[4852]: I1210 11:56:12.544877 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 10 11:56:12 crc kubenswrapper[4852]: I1210 11:56:12.920932 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 10 11:56:13 crc kubenswrapper[4852]: I1210 11:56:13.401024 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 10 11:56:13 crc kubenswrapper[4852]: I1210 11:56:13.404788 4852 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Dec 10 11:56:13 crc kubenswrapper[4852]: I1210 11:56:13.472215 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 10 11:56:13 crc kubenswrapper[4852]: I1210 11:56:13.565493 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 10 11:56:13 crc kubenswrapper[4852]: I1210 11:56:13.597792 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.037267 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.265908 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.456879 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.625281 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.793950 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.824963 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.827140 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 10 11:56:14 crc kubenswrapper[4852]: I1210 11:56:14.879722 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.104653 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.105796 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.158824 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.265086 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.305986 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.363940 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.377926 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.489844 4852 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.621992 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.690726 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.769637 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 10 11:56:15 crc kubenswrapper[4852]: I1210 11:56:15.885132 4852 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.071337 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.077274 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.313138 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.333053 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.369630 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.374928 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.402128 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.495725 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.556330 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.651856 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.694072 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.752343 4852 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 10 11:56:16 crc kubenswrapper[4852]: I1210 11:56:16.994249 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.036966 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 
11:56:17.064204 4852 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.067416 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.078171 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.123827 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.202763 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.543721 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 10 11:56:17 crc kubenswrapper[4852]: I1210 11:56:17.983538 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.086815 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.194954 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.291768 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.425187 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.455210 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.455290 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.455983 4852 scope.go:117] "RemoveContainer" containerID="f466e917954b90e012d340ea25b8b410327d3d7eee4f3803597e3074b432fdc5" Dec 10 11:56:18 crc kubenswrapper[4852]: E1210 11:56:18.456273 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-4cv5l_openshift-marketplace(1919e18e-d914-4ee7-8bf4-6de02e6760c2)\"" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.617658 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.712952 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.754184 4852 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.800127 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.846915 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.928040 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 10 11:56:18 crc kubenswrapper[4852]: I1210 11:56:18.994205 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.002899 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.073272 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.080779 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.243169 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.479023 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.499805 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.549496 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.578407 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.600833 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.604849 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.669011 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.820440 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 10 11:56:19 crc kubenswrapper[4852]: I1210 11:56:19.857592 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.002700 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.018366 4852 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.153423 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.240692 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.245631 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.277252 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.385952 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.484542 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.656722 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.667854 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.733604 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.953699 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.982547 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 10 11:56:20 crc kubenswrapper[4852]: I1210 11:56:20.990204 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.022052 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.221510 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.325346 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.485795 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.493281 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.500505 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 10 11:56:21 crc 
kubenswrapper[4852]: I1210 11:56:21.686459 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.776926 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.802960 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.815693 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 10 11:56:21 crc kubenswrapper[4852]: I1210 11:56:21.974863 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 10 11:56:22 crc kubenswrapper[4852]: I1210 11:56:22.207445 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 10 11:56:22 crc kubenswrapper[4852]: I1210 11:56:22.401983 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 10 11:56:22 crc kubenswrapper[4852]: I1210 11:56:22.455973 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 10 11:56:22 crc kubenswrapper[4852]: I1210 11:56:22.496848 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 10 11:56:22 crc kubenswrapper[4852]: I1210 11:56:22.651791 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 10 11:56:22 crc kubenswrapper[4852]: I1210 11:56:22.809390 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.022146 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.253060 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.417006 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.519528 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.543085 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.566893 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.667115 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 10 11:56:23 crc kubenswrapper[4852]: I1210 11:56:23.765788 4852 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.008967 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.034670 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.091577 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.272176 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.310484 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.347577 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.615128 4852 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.616085 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h2rb2" podStartSLOduration=72.867963411 podStartE2EDuration="3m18.616051296s" podCreationTimestamp="2025-12-10 11:53:06 +0000 UTC" firstStartedPulling="2025-12-10 11:53:09.121286603 +0000 UTC m=+75.206811827" lastFinishedPulling="2025-12-10 11:55:14.869374488 +0000 UTC m=+200.954899712" observedRunningTime="2025-12-10 11:55:31.266730015 +0000 UTC m=+217.352255239" watchObservedRunningTime="2025-12-10 11:56:24.616051296 +0000 UTC m=+270.701576520" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.617031 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-662t9" podStartSLOduration=73.967198943 podStartE2EDuration="3m19.617023453s" podCreationTimestamp="2025-12-10 11:53:05 +0000 UTC" firstStartedPulling="2025-12-10 11:53:09.214565562 +0000 UTC m=+75.300090786" lastFinishedPulling="2025-12-10 11:55:14.864390072 +0000 UTC m=+200.949915296" observedRunningTime="2025-12-10 11:55:31.284913028 +0000 UTC m=+217.370438252" watchObservedRunningTime="2025-12-10 11:56:24.617023453 +0000 UTC m=+270.702548677" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.619562 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xmxbw" podStartSLOduration=73.054412563 podStartE2EDuration="3m18.619553543s" podCreationTimestamp="2025-12-10 11:53:06 +0000 UTC" firstStartedPulling="2025-12-10 11:53:09.272359603 +0000 UTC m=+75.357884827" lastFinishedPulling="2025-12-10 11:55:14.837500583 +0000 UTC m=+200.923025807" observedRunningTime="2025-12-10 11:55:31.31357797 +0000 UTC m=+217.399103204" watchObservedRunningTime="2025-12-10 11:56:24.619553543 +0000 UTC m=+270.705078767" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.619689 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mqmsb" podStartSLOduration=90.177588036 podStartE2EDuration="3m17.619684417s" podCreationTimestamp="2025-12-10 11:53:07 +0000 UTC" firstStartedPulling="2025-12-10 
11:53:27.435752306 +0000 UTC m=+93.521277530" lastFinishedPulling="2025-12-10 11:55:14.877848697 +0000 UTC m=+200.963373911" observedRunningTime="2025-12-10 11:55:31.182322867 +0000 UTC m=+217.267848111" watchObservedRunningTime="2025-12-10 11:56:24.619684417 +0000 UTC m=+270.705209641" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.621807 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=86.621796465 podStartE2EDuration="1m26.621796465s" podCreationTimestamp="2025-12-10 11:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:55:31.194127023 +0000 UTC m=+217.279652247" watchObservedRunningTime="2025-12-10 11:56:24.621796465 +0000 UTC m=+270.707321689" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.621900 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dn8dv" podStartSLOduration=77.120966152 podStartE2EDuration="3m18.621893608s" podCreationTimestamp="2025-12-10 11:53:06 +0000 UTC" firstStartedPulling="2025-12-10 11:53:10.277410658 +0000 UTC m=+76.362935882" lastFinishedPulling="2025-12-10 11:55:11.778338114 +0000 UTC m=+197.863863338" observedRunningTime="2025-12-10 11:55:31.299018892 +0000 UTC m=+217.384544126" watchObservedRunningTime="2025-12-10 11:56:24.621893608 +0000 UTC m=+270.707418862" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.622871 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-r7qp9"] Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623075 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c"] Dec 10 11:56:24 crc kubenswrapper[4852]: E1210 11:56:24.623461 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" containerName="oauth-openshift" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623489 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" containerName="oauth-openshift" Dec 10 11:56:24 crc kubenswrapper[4852]: E1210 11:56:24.623514 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" containerName="installer" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623525 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" containerName="installer" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623593 4852 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623631 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e1400578-050a-4290-92f9-5db657016b2d" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623700 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" containerName="oauth-openshift" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.623744 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="be083c0f-9fb1-4b9a-9b5b-76bbacc4f009" 
containerName="installer" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.624529 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.632390 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.632520 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.632748 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.633712 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.633817 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.634029 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.634662 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.635216 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.636581 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.636601 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.636666 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.637094 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.637640 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.643819 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.653409 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.653464 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.653472 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.692502 4852 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=53.692467216 podStartE2EDuration="53.692467216s" podCreationTimestamp="2025-12-10 11:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:56:24.673631433 +0000 UTC m=+270.759156667" watchObservedRunningTime="2025-12-10 11:56:24.692467216 +0000 UTC m=+270.777992440" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.725334 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.787837 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793460 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793505 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-login\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793527 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793547 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793692 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-session\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793755 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793798 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-audit-policies\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793857 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-error\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.793890 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.794074 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.794203 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.794316 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/460797d9-846a-453d-819d-a63b74ae55e4-audit-dir\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.794356 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq99r\" (UniqueName: \"kubernetes.io/projected/460797d9-846a-453d-819d-a63b74ae55e4-kube-api-access-xq99r\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.794405 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896336 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896413 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-audit-policies\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896445 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-error\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896465 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896502 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896540 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896576 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/460797d9-846a-453d-819d-a63b74ae55e4-audit-dir\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896603 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq99r\" (UniqueName: 
\"kubernetes.io/projected/460797d9-846a-453d-819d-a63b74ae55e4-kube-api-access-xq99r\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896642 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896673 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896705 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-login\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896732 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896772 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.896802 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-session\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.897569 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-audit-policies\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.898276 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/460797d9-846a-453d-819d-a63b74ae55e4-audit-dir\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.900278 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.900815 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.900810 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.905611 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.906037 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-error\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.906037 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.906598 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.907073 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-session\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.907202 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-user-template-login\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.907641 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.916223 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/460797d9-846a-453d-819d-a63b74ae55e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.917368 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq99r\" (UniqueName: \"kubernetes.io/projected/460797d9-846a-453d-819d-a63b74ae55e4-kube-api-access-xq99r\") pod \"oauth-openshift-cc6d8cb6b-74b6c\" (UID: \"460797d9-846a-453d-819d-a63b74ae55e4\") " pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.946791 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.951966 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:24 crc kubenswrapper[4852]: I1210 11:56:24.978081 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.030082 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.164176 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c"] Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.173887 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.250114 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.250390 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.379267 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.488673 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" event={"ID":"460797d9-846a-453d-819d-a63b74ae55e4","Type":"ContainerStarted","Data":"b3723134984310ed429072ea71c99095b70852dab071d8a5f75412dfea797d01"} Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.488747 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" event={"ID":"460797d9-846a-453d-819d-a63b74ae55e4","Type":"ContainerStarted","Data":"2c5d10f44949df7939c9aaf3c56223090065a39a7df3aa6ff013147e30a62544"} Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.489352 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.490902 4852 patch_prober.go:28] interesting pod/oauth-openshift-cc6d8cb6b-74b6c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.490959 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" podUID="460797d9-846a-453d-819d-a63b74ae55e4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.510964 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" podStartSLOduration=94.510940291 podStartE2EDuration="1m34.510940291s" podCreationTimestamp="2025-12-10 11:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:56:25.510093808 +0000 UTC m=+271.595619032" watchObservedRunningTime="2025-12-10 11:56:25.510940291 +0000 UTC m=+271.596465515" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 
11:56:25.725118 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.732184 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.843265 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 10 11:56:25 crc kubenswrapper[4852]: I1210 11:56:25.895266 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.179666 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="213e0da5-7024-4329-b4cb-ae2fa8fe1f0d" path="/var/lib/kubelet/pods/213e0da5-7024-4329-b4cb-ae2fa8fe1f0d/volumes" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.341090 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.461483 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.494951 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.495485 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cc6d8cb6b-74b6c_460797d9-846a-453d-819d-a63b74ae55e4/oauth-openshift/0.log" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.495554 4852 generic.go:334] "Generic (PLEG): container finished" podID="460797d9-846a-453d-819d-a63b74ae55e4" containerID="b3723134984310ed429072ea71c99095b70852dab071d8a5f75412dfea797d01" exitCode=255 Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.495596 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" event={"ID":"460797d9-846a-453d-819d-a63b74ae55e4","Type":"ContainerDied","Data":"b3723134984310ed429072ea71c99095b70852dab071d8a5f75412dfea797d01"} Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.496145 4852 scope.go:117] "RemoveContainer" containerID="b3723134984310ed429072ea71c99095b70852dab071d8a5f75412dfea797d01" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.807664 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.813194 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 10 11:56:26 crc kubenswrapper[4852]: I1210 11:56:26.886121 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.403649 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.432450 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.443788 4852 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.504870 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cc6d8cb6b-74b6c_460797d9-846a-453d-819d-a63b74ae55e4/oauth-openshift/0.log" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.504948 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" event={"ID":"460797d9-846a-453d-819d-a63b74ae55e4","Type":"ContainerStarted","Data":"65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024"} Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.506401 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.506785 4852 patch_prober.go:28] interesting pod/oauth-openshift-cc6d8cb6b-74b6c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.506827 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" podUID="460797d9-846a-453d-819d-a63b74ae55e4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.580800 4852 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.581053 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a" gracePeriod=5 Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.638904 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 10 11:56:27 crc kubenswrapper[4852]: I1210 11:56:27.960881 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.327294 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.438923 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.512871 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cc6d8cb6b-74b6c_460797d9-846a-453d-819d-a63b74ae55e4/oauth-openshift/1.log" Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.513615 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cc6d8cb6b-74b6c_460797d9-846a-453d-819d-a63b74ae55e4/oauth-openshift/0.log" Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.513702 4852 generic.go:334] "Generic (PLEG): container finished" podID="460797d9-846a-453d-819d-a63b74ae55e4" 
containerID="65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024" exitCode=255 Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.513752 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" event={"ID":"460797d9-846a-453d-819d-a63b74ae55e4","Type":"ContainerDied","Data":"65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024"} Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.513806 4852 scope.go:117] "RemoveContainer" containerID="b3723134984310ed429072ea71c99095b70852dab071d8a5f75412dfea797d01" Dec 10 11:56:28 crc kubenswrapper[4852]: I1210 11:56:28.514587 4852 scope.go:117] "RemoveContainer" containerID="65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024" Dec 10 11:56:28 crc kubenswrapper[4852]: E1210 11:56:28.514849 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cc6d8cb6b-74b6c_openshift-authentication(460797d9-846a-453d-819d-a63b74ae55e4)\"" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" podUID="460797d9-846a-453d-819d-a63b74ae55e4" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.096309 4852 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.153644 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.170297 4852 scope.go:117] "RemoveContainer" containerID="f466e917954b90e012d340ea25b8b410327d3d7eee4f3803597e3074b432fdc5" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.342052 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.408765 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.521348 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cc6d8cb6b-74b6c_460797d9-846a-453d-819d-a63b74ae55e4/oauth-openshift/1.log" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.521883 4852 scope.go:117] "RemoveContainer" containerID="65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024" Dec 10 11:56:29 crc kubenswrapper[4852]: E1210 11:56:29.522047 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cc6d8cb6b-74b6c_openshift-authentication(460797d9-846a-453d-819d-a63b74ae55e4)\"" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" podUID="460797d9-846a-453d-819d-a63b74ae55e4" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.524068 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4cv5l_1919e18e-d914-4ee7-8bf4-6de02e6760c2/marketplace-operator/1.log" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.524102 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" 
event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerStarted","Data":"354da5458a6e60aebf6ae46f7fbeb789f20298d4997e8b6637776e045edc2a74"} Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.524763 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.525944 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4cv5l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.525985 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 10 11:56:29 crc kubenswrapper[4852]: I1210 11:56:29.708263 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 10 11:56:30 crc kubenswrapper[4852]: I1210 11:56:30.132478 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 10 11:56:30 crc kubenswrapper[4852]: I1210 11:56:30.351511 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 10 11:56:30 crc kubenswrapper[4852]: I1210 11:56:30.533175 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:56:31 crc kubenswrapper[4852]: I1210 11:56:31.932793 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 10 11:56:32 crc kubenswrapper[4852]: I1210 11:56:32.433985 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 10 11:56:32 crc kubenswrapper[4852]: I1210 11:56:32.470813 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 10 11:56:32 crc kubenswrapper[4852]: I1210 11:56:32.738105 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.170213 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.170293 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311527 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311711 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311742 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311735 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311763 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311852 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311881 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.311936 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.312044 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.312861 4852 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.312902 4852 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.312922 4852 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.312933 4852 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.320766 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.414004 4852 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.420459 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.545229 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.545325 4852 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a" exitCode=137 Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.545394 4852 scope.go:117] "RemoveContainer" containerID="1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.545544 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.565968 4852 scope.go:117] "RemoveContainer" containerID="1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a" Dec 10 11:56:33 crc kubenswrapper[4852]: E1210 11:56:33.566916 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a\": container with ID starting with 1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a not found: ID does not exist" containerID="1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.567018 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a"} err="failed to get container status \"1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a\": rpc error: code = NotFound desc = could not find container \"1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a\": container with ID starting with 1421c2aebd3eea6f568e81c8bacda655e646914f7957ef7a6df14e08ec364f8a not found: ID does not exist" Dec 10 11:56:33 crc kubenswrapper[4852]: I1210 11:56:33.962833 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.176375 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.176620 4852 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.187216 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.187879 4852 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="38d6433f-3c5e-486e-9233-19fc9c46fa92" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.191614 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.191660 4852 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="38d6433f-3c5e-486e-9233-19fc9c46fa92" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.601004 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.952322 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:34 crc kubenswrapper[4852]: I1210 11:56:34.954028 4852 scope.go:117] "RemoveContainer" containerID="65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024" Dec 10 11:56:34 crc kubenswrapper[4852]: E1210 11:56:34.954463 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cc6d8cb6b-74b6c_openshift-authentication(460797d9-846a-453d-819d-a63b74ae55e4)\"" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" podUID="460797d9-846a-453d-819d-a63b74ae55e4" Dec 10 11:56:35 crc kubenswrapper[4852]: I1210 11:56:35.439608 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 10 11:56:49 crc kubenswrapper[4852]: I1210 11:56:49.170473 4852 scope.go:117] "RemoveContainer" containerID="65f2ba1ff2968b00c0edc297aa06c9ee00cbbafdfee482cee15f72dce5284024" Dec 10 11:56:49 crc kubenswrapper[4852]: I1210 11:56:49.649429 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cc6d8cb6b-74b6c_460797d9-846a-453d-819d-a63b74ae55e4/oauth-openshift/1.log" Dec 10 11:56:49 crc kubenswrapper[4852]: I1210 11:56:49.649504 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" event={"ID":"460797d9-846a-453d-819d-a63b74ae55e4","Type":"ContainerStarted","Data":"828d5dbeaeff4294ed8633f2d2c13e5aae1366b76d825bff44221dced1f29fb2"} Dec 10 11:56:49 crc kubenswrapper[4852]: I1210 11:56:49.650871 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:49 crc kubenswrapper[4852]: I1210 11:56:49.822840 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-cc6d8cb6b-74b6c" Dec 10 11:56:54 crc kubenswrapper[4852]: I1210 11:56:54.053177 4852 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 10 11:56:59 crc kubenswrapper[4852]: I1210 11:56:59.465336 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wzdw"] Dec 10 11:56:59 crc kubenswrapper[4852]: I1210 11:56:59.466727 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" podUID="400678dc-74d6-4b93-aa8a-7468710877d6" containerName="controller-manager" containerID="cri-o://6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795" gracePeriod=30 Dec 10 11:56:59 crc kubenswrapper[4852]: I1210 11:56:59.586292 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"] Dec 10 11:56:59 crc kubenswrapper[4852]: I1210 11:56:59.587115 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" podUID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" containerName="route-controller-manager" containerID="cri-o://67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850" gracePeriod=30 Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.389027 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.426868 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-proxy-ca-bundles\") pod \"400678dc-74d6-4b93-aa8a-7468710877d6\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.427029 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-client-ca\") pod \"400678dc-74d6-4b93-aa8a-7468710877d6\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.427099 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc7c8\" (UniqueName: \"kubernetes.io/projected/400678dc-74d6-4b93-aa8a-7468710877d6-kube-api-access-qc7c8\") pod \"400678dc-74d6-4b93-aa8a-7468710877d6\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.427263 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400678dc-74d6-4b93-aa8a-7468710877d6-serving-cert\") pod \"400678dc-74d6-4b93-aa8a-7468710877d6\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.427336 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-config\") pod \"400678dc-74d6-4b93-aa8a-7468710877d6\" (UID: \"400678dc-74d6-4b93-aa8a-7468710877d6\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.428625 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "400678dc-74d6-4b93-aa8a-7468710877d6" (UID: "400678dc-74d6-4b93-aa8a-7468710877d6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.429151 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-config" (OuterVolumeSpecName: "config") pod "400678dc-74d6-4b93-aa8a-7468710877d6" (UID: "400678dc-74d6-4b93-aa8a-7468710877d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.429305 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "400678dc-74d6-4b93-aa8a-7468710877d6" (UID: "400678dc-74d6-4b93-aa8a-7468710877d6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.436397 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400678dc-74d6-4b93-aa8a-7468710877d6-kube-api-access-qc7c8" (OuterVolumeSpecName: "kube-api-access-qc7c8") pod "400678dc-74d6-4b93-aa8a-7468710877d6" (UID: "400678dc-74d6-4b93-aa8a-7468710877d6"). InnerVolumeSpecName "kube-api-access-qc7c8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.437525 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/400678dc-74d6-4b93-aa8a-7468710877d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "400678dc-74d6-4b93-aa8a-7468710877d6" (UID: "400678dc-74d6-4b93-aa8a-7468710877d6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.529600 4852 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.529658 4852 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.529673 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc7c8\" (UniqueName: \"kubernetes.io/projected/400678dc-74d6-4b93-aa8a-7468710877d6-kube-api-access-qc7c8\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.529689 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400678dc-74d6-4b93-aa8a-7468710877d6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.529701 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400678dc-74d6-4b93-aa8a-7468710877d6-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.573861 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.630475 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-client-ca\") pod \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.630638 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-config\") pod \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.630679 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw6v5\" (UniqueName: \"kubernetes.io/projected/76310b37-2f80-4da3-8b7e-8dde4ce8117c-kube-api-access-lw6v5\") pod \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.630708 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76310b37-2f80-4da3-8b7e-8dde4ce8117c-serving-cert\") pod \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\" (UID: \"76310b37-2f80-4da3-8b7e-8dde4ce8117c\") " Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.631817 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-client-ca" (OuterVolumeSpecName: "client-ca") pod "76310b37-2f80-4da3-8b7e-8dde4ce8117c" (UID: "76310b37-2f80-4da3-8b7e-8dde4ce8117c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.631974 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-config" (OuterVolumeSpecName: "config") pod "76310b37-2f80-4da3-8b7e-8dde4ce8117c" (UID: "76310b37-2f80-4da3-8b7e-8dde4ce8117c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.636851 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76310b37-2f80-4da3-8b7e-8dde4ce8117c-kube-api-access-lw6v5" (OuterVolumeSpecName: "kube-api-access-lw6v5") pod "76310b37-2f80-4da3-8b7e-8dde4ce8117c" (UID: "76310b37-2f80-4da3-8b7e-8dde4ce8117c"). InnerVolumeSpecName "kube-api-access-lw6v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.641034 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76310b37-2f80-4da3-8b7e-8dde4ce8117c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76310b37-2f80-4da3-8b7e-8dde4ce8117c" (UID: "76310b37-2f80-4da3-8b7e-8dde4ce8117c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.719563 4852 generic.go:334] "Generic (PLEG): container finished" podID="400678dc-74d6-4b93-aa8a-7468710877d6" containerID="6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795" exitCode=0 Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.719678 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" event={"ID":"400678dc-74d6-4b93-aa8a-7468710877d6","Type":"ContainerDied","Data":"6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795"} Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.719729 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" event={"ID":"400678dc-74d6-4b93-aa8a-7468710877d6","Type":"ContainerDied","Data":"ff27122228455340acda3dbb16731b99cb2fb365d88f45d455df27e22f9fc4fc"} Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.719754 4852 scope.go:117] "RemoveContainer" containerID="6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.719788 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wzdw" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.721396 4852 generic.go:334] "Generic (PLEG): container finished" podID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" containerID="67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850" exitCode=0 Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.721428 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" event={"ID":"76310b37-2f80-4da3-8b7e-8dde4ce8117c","Type":"ContainerDied","Data":"67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850"} Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.721449 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" event={"ID":"76310b37-2f80-4da3-8b7e-8dde4ce8117c","Type":"ContainerDied","Data":"38894768e0a5539f283e85f0509bcb7337f97851b0f05577515f45e3c9a8367a"} Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.721504 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.732146 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.732219 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw6v5\" (UniqueName: \"kubernetes.io/projected/76310b37-2f80-4da3-8b7e-8dde4ce8117c-kube-api-access-lw6v5\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.732247 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76310b37-2f80-4da3-8b7e-8dde4ce8117c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.732259 4852 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76310b37-2f80-4da3-8b7e-8dde4ce8117c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.755129 4852 scope.go:117] "RemoveContainer" containerID="6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795" Dec 10 11:57:00 crc kubenswrapper[4852]: E1210 11:57:00.756775 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795\": container with ID starting with 6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795 not found: ID does not exist" containerID="6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.756815 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795"} err="failed to get container status \"6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795\": rpc error: code = NotFound desc = could not find container \"6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795\": container with ID starting with 6e9c3ddc6f2313d785365a66be7614ee73b15584e67a60538bbfed6f28e76795 not found: ID does not exist" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.756845 4852 scope.go:117] "RemoveContainer" containerID="67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.758594 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"] Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.769194 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6bdw"] Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.780539 4852 scope.go:117] "RemoveContainer" containerID="67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850" Dec 10 11:57:00 crc kubenswrapper[4852]: E1210 11:57:00.781297 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850\": container with ID starting with 67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850 not found: ID does not exist" 
containerID="67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.781329 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850"} err="failed to get container status \"67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850\": rpc error: code = NotFound desc = could not find container \"67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850\": container with ID starting with 67262c1142b46090edcbdc01530f63221e076c61848b50599eb32e0cd44f6850 not found: ID does not exist" Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.801452 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wzdw"] Dec 10 11:57:00 crc kubenswrapper[4852]: I1210 11:57:00.804455 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wzdw"] Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.085728 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7"] Dec 10 11:57:01 crc kubenswrapper[4852]: E1210 11:57:01.086170 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" containerName="route-controller-manager" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.086193 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" containerName="route-controller-manager" Dec 10 11:57:01 crc kubenswrapper[4852]: E1210 11:57:01.086205 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400678dc-74d6-4b93-aa8a-7468710877d6" containerName="controller-manager" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.086215 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="400678dc-74d6-4b93-aa8a-7468710877d6" containerName="controller-manager" Dec 10 11:57:01 crc kubenswrapper[4852]: E1210 11:57:01.086263 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.086276 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.086631 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="400678dc-74d6-4b93-aa8a-7468710877d6" containerName="controller-manager" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.086649 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.086658 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" containerName="route-controller-manager" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.087401 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.091071 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.091080 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.091155 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.092076 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.092135 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.092349 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.098055 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-599b86c796-fssjl"] Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.099161 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.102021 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.102100 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.102325 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.102381 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.103054 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.103210 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.104326 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7"] Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.119981 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-599b86c796-fssjl"] Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.124529 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.137710 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gchl\" (UniqueName: 
\"kubernetes.io/projected/eb28549f-fdfd-4603-8d46-ecc661e902ab-kube-api-access-5gchl\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138068 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-config\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138213 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-config\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138367 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-proxy-ca-bundles\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138487 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb28549f-fdfd-4603-8d46-ecc661e902ab-serving-cert\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138571 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-client-ca\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138755 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gm5h\" (UniqueName: \"kubernetes.io/projected/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-kube-api-access-6gm5h\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138866 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-client-ca\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.138987 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-serving-cert\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.240835 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-proxy-ca-bundles\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.240997 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb28549f-fdfd-4603-8d46-ecc661e902ab-serving-cert\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241051 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-client-ca\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241092 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gm5h\" (UniqueName: \"kubernetes.io/projected/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-kube-api-access-6gm5h\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241147 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-client-ca\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241180 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-serving-cert\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241312 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gchl\" (UniqueName: \"kubernetes.io/projected/eb28549f-fdfd-4603-8d46-ecc661e902ab-kube-api-access-5gchl\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241423 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-config\") pod \"controller-manager-599b86c796-fssjl\" 
(UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.241465 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-config\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.242718 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-client-ca\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.242760 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-proxy-ca-bundles\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.243730 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-config\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.243929 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-client-ca\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.244194 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-config\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.245921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb28549f-fdfd-4603-8d46-ecc661e902ab-serving-cert\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.246048 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-serving-cert\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.264278 4852 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5gchl\" (UniqueName: \"kubernetes.io/projected/eb28549f-fdfd-4603-8d46-ecc661e902ab-kube-api-access-5gchl\") pod \"controller-manager-599b86c796-fssjl\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.264294 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gm5h\" (UniqueName: \"kubernetes.io/projected/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-kube-api-access-6gm5h\") pod \"route-controller-manager-86dcf56f75-9lhv7\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.422030 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.431415 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.657738 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-599b86c796-fssjl"] Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.709233 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7"] Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.740451 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" event={"ID":"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce","Type":"ContainerStarted","Data":"308104ddeb73503e09a59eabca323b35ed39a8dbbe4e418d3021832e179d4451"} Dec 10 11:57:01 crc kubenswrapper[4852]: I1210 11:57:01.743071 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" event={"ID":"eb28549f-fdfd-4603-8d46-ecc661e902ab","Type":"ContainerStarted","Data":"bd66b46025382b89861c72287a0b8edaee72db68cf2fe0bbb5d07dc13e17b699"} Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.177511 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400678dc-74d6-4b93-aa8a-7468710877d6" path="/var/lib/kubelet/pods/400678dc-74d6-4b93-aa8a-7468710877d6/volumes" Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.178241 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76310b37-2f80-4da3-8b7e-8dde4ce8117c" path="/var/lib/kubelet/pods/76310b37-2f80-4da3-8b7e-8dde4ce8117c/volumes" Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.752499 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" event={"ID":"eb28549f-fdfd-4603-8d46-ecc661e902ab","Type":"ContainerStarted","Data":"28da13432d1bc9eed4e569f56aac7ce813891af6e232e3bf8163a734d60f6629"} Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.752861 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.755054 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" 
event={"ID":"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce","Type":"ContainerStarted","Data":"0445566f9d600495a0f721ae01ff3b93d9b26996e17a0e18f89caf9f084a5309"} Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.756168 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.758218 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.761600 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:02 crc kubenswrapper[4852]: I1210 11:57:02.771746 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" podStartSLOduration=3.7717183050000003 podStartE2EDuration="3.771718305s" podCreationTimestamp="2025-12-10 11:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:57:02.770149872 +0000 UTC m=+308.855675096" watchObservedRunningTime="2025-12-10 11:57:02.771718305 +0000 UTC m=+308.857243549" Dec 10 11:57:39 crc kubenswrapper[4852]: I1210 11:57:39.449687 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" podStartSLOduration=40.449647438 podStartE2EDuration="40.449647438s" podCreationTimestamp="2025-12-10 11:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:57:02.816735514 +0000 UTC m=+308.902260758" watchObservedRunningTime="2025-12-10 11:57:39.449647438 +0000 UTC m=+345.535172662" Dec 10 11:57:39 crc kubenswrapper[4852]: I1210 11:57:39.453244 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-599b86c796-fssjl"] Dec 10 11:57:39 crc kubenswrapper[4852]: I1210 11:57:39.453702 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" podUID="eb28549f-fdfd-4603-8d46-ecc661e902ab" containerName="controller-manager" containerID="cri-o://28da13432d1bc9eed4e569f56aac7ce813891af6e232e3bf8163a734d60f6629" gracePeriod=30 Dec 10 11:57:39 crc kubenswrapper[4852]: I1210 11:57:39.457867 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7"] Dec 10 11:57:39 crc kubenswrapper[4852]: I1210 11:57:39.458163 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" podUID="5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" containerName="route-controller-manager" containerID="cri-o://0445566f9d600495a0f721ae01ff3b93d9b26996e17a0e18f89caf9f084a5309" gracePeriod=30 Dec 10 11:57:40 crc kubenswrapper[4852]: I1210 11:57:40.975569 4852 generic.go:334] "Generic (PLEG): container finished" podID="eb28549f-fdfd-4603-8d46-ecc661e902ab" containerID="28da13432d1bc9eed4e569f56aac7ce813891af6e232e3bf8163a734d60f6629" exitCode=0 Dec 10 11:57:40 crc kubenswrapper[4852]: I1210 11:57:40.975664 4852 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" event={"ID":"eb28549f-fdfd-4603-8d46-ecc661e902ab","Type":"ContainerDied","Data":"28da13432d1bc9eed4e569f56aac7ce813891af6e232e3bf8163a734d60f6629"} Dec 10 11:57:40 crc kubenswrapper[4852]: I1210 11:57:40.979655 4852 generic.go:334] "Generic (PLEG): container finished" podID="5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" containerID="0445566f9d600495a0f721ae01ff3b93d9b26996e17a0e18f89caf9f084a5309" exitCode=0 Dec 10 11:57:40 crc kubenswrapper[4852]: I1210 11:57:40.979727 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" event={"ID":"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce","Type":"ContainerDied","Data":"0445566f9d600495a0f721ae01ff3b93d9b26996e17a0e18f89caf9f084a5309"} Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.119424 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.153378 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt"] Dec 10 11:57:41 crc kubenswrapper[4852]: E1210 11:57:41.153850 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" containerName="route-controller-manager" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.153869 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" containerName="route-controller-manager" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.154043 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" containerName="route-controller-manager" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.154797 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.166701 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt"] Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.206668 4852 util.go:48] "No ready sandbox for pod can be found. 
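Each event= payload in the PLEG entries above and below is plain JSON. A short Go sketch for pulling one apart, with the field shape taken directly from the log (the type name plegEvent is illustrative, not the kubelet's internal type):

    package main

    import (
    	"encoding/json"
    	"fmt"
    )

    // plegEvent mirrors the event=... payload in the entries above.
    type plegEvent struct {
    	ID   string // pod UID
    	Type string // ContainerStarted, ContainerDied, ...
    	Data string // container or sandbox ID
    }

    func main() {
    	raw := `{"ID":"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce","Type":"ContainerDied","Data":"0445566f9d600495a0f721ae01ff3b93d9b26996e17a0e18f89caf9f084a5309"}`
    	var ev plegEvent
    	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
    		panic(err)
    	}
    	fmt.Printf("pod %s: %s %s...\n", ev.ID, ev.Type, ev.Data[:12])
    }
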
Need to start a new one" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.247696 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-client-ca\") pod \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.247816 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gm5h\" (UniqueName: \"kubernetes.io/projected/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-kube-api-access-6gm5h\") pod \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.247858 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-serving-cert\") pod \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.248803 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" (UID: "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.249153 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-config\") pod \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\" (UID: \"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.249410 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcj8\" (UniqueName: \"kubernetes.io/projected/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-kube-api-access-gwcj8\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.249487 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-serving-cert\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.249919 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-config\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.249935 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-config" (OuterVolumeSpecName: "config") pod 
"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" (UID: "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.249963 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-client-ca\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.250166 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.250219 4852 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-client-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.254162 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-kube-api-access-6gm5h" (OuterVolumeSpecName: "kube-api-access-6gm5h") pod "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" (UID: "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce"). InnerVolumeSpecName "kube-api-access-6gm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.254298 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" (UID: "5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.350658 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-proxy-ca-bundles\") pod \"eb28549f-fdfd-4603-8d46-ecc661e902ab\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.350703 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gchl\" (UniqueName: \"kubernetes.io/projected/eb28549f-fdfd-4603-8d46-ecc661e902ab-kube-api-access-5gchl\") pod \"eb28549f-fdfd-4603-8d46-ecc661e902ab\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.350759 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb28549f-fdfd-4603-8d46-ecc661e902ab-serving-cert\") pod \"eb28549f-fdfd-4603-8d46-ecc661e902ab\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.350797 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-config\") pod \"eb28549f-fdfd-4603-8d46-ecc661e902ab\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.350857 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-client-ca\") pod \"eb28549f-fdfd-4603-8d46-ecc661e902ab\" (UID: \"eb28549f-fdfd-4603-8d46-ecc661e902ab\") " Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.351085 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-config\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.351108 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-client-ca\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.351141 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcj8\" (UniqueName: \"kubernetes.io/projected/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-kube-api-access-gwcj8\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.351166 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-serving-cert\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 
10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.351288 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gm5h\" (UniqueName: \"kubernetes.io/projected/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-kube-api-access-6gm5h\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.351308 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.352083 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb28549f-fdfd-4603-8d46-ecc661e902ab" (UID: "eb28549f-fdfd-4603-8d46-ecc661e902ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.352094 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eb28549f-fdfd-4603-8d46-ecc661e902ab" (UID: "eb28549f-fdfd-4603-8d46-ecc661e902ab"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.352140 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-config" (OuterVolumeSpecName: "config") pod "eb28549f-fdfd-4603-8d46-ecc661e902ab" (UID: "eb28549f-fdfd-4603-8d46-ecc661e902ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.353321 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-config\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.353567 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-client-ca\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.355439 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb28549f-fdfd-4603-8d46-ecc661e902ab-kube-api-access-5gchl" (OuterVolumeSpecName: "kube-api-access-5gchl") pod "eb28549f-fdfd-4603-8d46-ecc661e902ab" (UID: "eb28549f-fdfd-4603-8d46-ecc661e902ab"). InnerVolumeSpecName "kube-api-access-5gchl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.355802 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb28549f-fdfd-4603-8d46-ecc661e902ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb28549f-fdfd-4603-8d46-ecc661e902ab" (UID: "eb28549f-fdfd-4603-8d46-ecc661e902ab"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.356384 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-serving-cert\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.369708 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcj8\" (UniqueName: \"kubernetes.io/projected/d3b25f6c-c695-4a2f-855f-ec7f580d1eb8-kube-api-access-gwcj8\") pod \"route-controller-manager-5766bfbddb-wwtpt\" (UID: \"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8\") " pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.452772 4852 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb28549f-fdfd-4603-8d46-ecc661e902ab-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.452817 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-config\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.452826 4852 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-client-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.452836 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gchl\" (UniqueName: \"kubernetes.io/projected/eb28549f-fdfd-4603-8d46-ecc661e902ab-kube-api-access-5gchl\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.452853 4852 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb28549f-fdfd-4603-8d46-ecc661e902ab-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.504799 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.688663 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt"] Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.993544 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" event={"ID":"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8","Type":"ContainerStarted","Data":"5ad42bc105f1ebbc302e6b8a1148a7af80e7b8271d1b0af269e48ab60c82016b"} Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.996476 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" event={"ID":"eb28549f-fdfd-4603-8d46-ecc661e902ab","Type":"ContainerDied","Data":"bd66b46025382b89861c72287a0b8edaee72db68cf2fe0bbb5d07dc13e17b699"} Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.996547 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-599b86c796-fssjl" Dec 10 11:57:41 crc kubenswrapper[4852]: I1210 11:57:41.996621 4852 scope.go:117] "RemoveContainer" containerID="28da13432d1bc9eed4e569f56aac7ce813891af6e232e3bf8163a734d60f6629" Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.000311 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" event={"ID":"5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce","Type":"ContainerDied","Data":"308104ddeb73503e09a59eabca323b35ed39a8dbbe4e418d3021832e179d4451"} Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.000429 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7" Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.016186 4852 scope.go:117] "RemoveContainer" containerID="0445566f9d600495a0f721ae01ff3b93d9b26996e17a0e18f89caf9f084a5309" Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.025891 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-599b86c796-fssjl"] Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.031829 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-599b86c796-fssjl"] Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.043439 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7"] Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.046470 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86dcf56f75-9lhv7"] Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.176184 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce" path="/var/lib/kubelet/pods/5243dda8-7e2c-4cf4-aae9-39d0bf1d11ce/volumes" Dec 10 11:57:42 crc kubenswrapper[4852]: I1210 11:57:42.176810 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb28549f-fdfd-4603-8d46-ecc661e902ab" path="/var/lib/kubelet/pods/eb28549f-fdfd-4603-8d46-ecc661e902ab/volumes" Dec 10 11:57:43 crc kubenswrapper[4852]: I1210 11:57:43.007018 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" event={"ID":"d3b25f6c-c695-4a2f-855f-ec7f580d1eb8","Type":"ContainerStarted","Data":"008aab1dab851bfed51910c1c38f1206fbbb6b6b58e50023fa26f1983b0be687"} Dec 10 11:57:43 crc kubenswrapper[4852]: I1210 11:57:43.007289 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:43 crc kubenswrapper[4852]: I1210 11:57:43.011912 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" Dec 10 11:57:43 crc kubenswrapper[4852]: I1210 11:57:43.031524 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5766bfbddb-wwtpt" podStartSLOduration=4.031509092 podStartE2EDuration="4.031509092s" podCreationTimestamp="2025-12-10 11:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-10 11:57:43.027721972 +0000 UTC m=+349.113247206" watchObservedRunningTime="2025-12-10 11:57:43.031509092 +0000 UTC m=+349.117034316" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.108337 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57968d6747-wkzk9"] Dec 10 11:57:44 crc kubenswrapper[4852]: E1210 11:57:44.109206 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb28549f-fdfd-4603-8d46-ecc661e902ab" containerName="controller-manager" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.109324 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb28549f-fdfd-4603-8d46-ecc661e902ab" containerName="controller-manager" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.109552 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb28549f-fdfd-4603-8d46-ecc661e902ab" containerName="controller-manager" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.110078 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.115683 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.116381 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.116564 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.116745 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.116916 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.118628 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8n28\" (UniqueName: \"kubernetes.io/projected/977c260a-5c87-4795-90d2-534d4099813e-kube-api-access-j8n28\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.118696 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-config\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.118772 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/977c260a-5c87-4795-90d2-534d4099813e-serving-cert\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.118798 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-proxy-ca-bundles\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.118837 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-client-ca\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.118858 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.125972 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.132168 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57968d6747-wkzk9"] Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.219594 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8n28\" (UniqueName: \"kubernetes.io/projected/977c260a-5c87-4795-90d2-534d4099813e-kube-api-access-j8n28\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.219662 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-config\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.219990 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/977c260a-5c87-4795-90d2-534d4099813e-serving-cert\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.220036 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-proxy-ca-bundles\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.220098 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-client-ca\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.221090 4852 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-client-ca\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.221547 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-proxy-ca-bundles\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.223179 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/977c260a-5c87-4795-90d2-534d4099813e-config\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.226168 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/977c260a-5c87-4795-90d2-534d4099813e-serving-cert\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.238691 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8n28\" (UniqueName: \"kubernetes.io/projected/977c260a-5c87-4795-90d2-534d4099813e-kube-api-access-j8n28\") pod \"controller-manager-57968d6747-wkzk9\" (UID: \"977c260a-5c87-4795-90d2-534d4099813e\") " pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.427043 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:44 crc kubenswrapper[4852]: I1210 11:57:44.613599 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57968d6747-wkzk9"] Dec 10 11:57:45 crc kubenswrapper[4852]: I1210 11:57:45.020850 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" event={"ID":"977c260a-5c87-4795-90d2-534d4099813e","Type":"ContainerStarted","Data":"0fecb759c4f3f3902bd6c9da912b5dfc06926693b45438ac342ef67d20dffd99"} Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.026090 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" event={"ID":"977c260a-5c87-4795-90d2-534d4099813e","Type":"ContainerStarted","Data":"9928897a8b7a611f92254274b18041ae3c77ce3f3e9f3b72b538e953089e58b1"} Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.026363 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.031555 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.041975 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57968d6747-wkzk9" podStartSLOduration=7.041958632 podStartE2EDuration="7.041958632s" podCreationTimestamp="2025-12-10 11:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:57:46.041185252 +0000 UTC m=+352.126710486" watchObservedRunningTime="2025-12-10 11:57:46.041958632 +0000 UTC m=+352.127483856" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.689579 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f7g78"] Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.690519 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.703285 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f7g78"] Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851573 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851638 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-trusted-ca\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851692 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851729 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851757 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-registry-certificates\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851794 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-registry-tls\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.851928 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54q84\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-kube-api-access-54q84\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.852134 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-bound-sa-token\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.876438 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953545 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-registry-tls\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953588 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54q84\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-kube-api-access-54q84\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953641 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-bound-sa-token\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953664 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953689 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-trusted-ca\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953723 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.953742 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-registry-certificates\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.954435 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.954913 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-registry-certificates\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.956200 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-trusted-ca\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.960128 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.960486 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-registry-tls\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.970671 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54q84\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-kube-api-access-54q84\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:46 crc kubenswrapper[4852]: I1210 11:57:46.974008 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44c4a4d7-e189-4ecd-8fa8-ea0d738536fa-bound-sa-token\") pod \"image-registry-66df7c8f76-f7g78\" (UID: \"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:47 crc kubenswrapper[4852]: I1210 11:57:47.006558 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:47 crc kubenswrapper[4852]: I1210 11:57:47.403478 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f7g78"] Dec 10 11:57:47 crc kubenswrapper[4852]: W1210 11:57:47.407244 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c4a4d7_e189_4ecd_8fa8_ea0d738536fa.slice/crio-3240b0e57e2877fbf3f31d64e4ff99297e28d8e5e045b21139262a724796a513 WatchSource:0}: Error finding container 3240b0e57e2877fbf3f31d64e4ff99297e28d8e5e045b21139262a724796a513: Status 404 returned error can't find the container with id 3240b0e57e2877fbf3f31d64e4ff99297e28d8e5e045b21139262a724796a513 Dec 10 11:57:48 crc kubenswrapper[4852]: I1210 11:57:48.037480 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" event={"ID":"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa","Type":"ContainerStarted","Data":"22146ff6a06fe4760c6369052924946cd37cf81f1a174e6c135d3435b5ef3184"} Dec 10 11:57:48 crc kubenswrapper[4852]: I1210 11:57:48.037798 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" event={"ID":"44c4a4d7-e189-4ecd-8fa8-ea0d738536fa","Type":"ContainerStarted","Data":"3240b0e57e2877fbf3f31d64e4ff99297e28d8e5e045b21139262a724796a513"} Dec 10 11:57:48 crc kubenswrapper[4852]: I1210 11:57:48.037821 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:57:48 crc kubenswrapper[4852]: I1210 11:57:48.055252 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" podStartSLOduration=2.05520788 podStartE2EDuration="2.05520788s" podCreationTimestamp="2025-12-10 11:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:57:48.054768828 +0000 UTC m=+354.140294042" watchObservedRunningTime="2025-12-10 11:57:48.05520788 +0000 UTC m=+354.140733104" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.021579 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dn8dv"] Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.022940 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dn8dv" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="registry-server" containerID="cri-o://a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7" gracePeriod=2 Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.221983 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2rb2"] Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.223134 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h2rb2" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="registry-server" containerID="cri-o://a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe" gracePeriod=2 Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.596456 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.766784 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tswc\" (UniqueName: \"kubernetes.io/projected/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-kube-api-access-5tswc\") pod \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.767169 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-catalog-content\") pod \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.767217 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-utilities\") pod \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\" (UID: \"eb61d8f4-66d0-4d11-955f-4984ab5e18e6\") " Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.768226 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-utilities" (OuterVolumeSpecName: "utilities") pod "eb61d8f4-66d0-4d11-955f-4984ab5e18e6" (UID: "eb61d8f4-66d0-4d11-955f-4984ab5e18e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.783530 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-kube-api-access-5tswc" (OuterVolumeSpecName: "kube-api-access-5tswc") pod "eb61d8f4-66d0-4d11-955f-4984ab5e18e6" (UID: "eb61d8f4-66d0-4d11-955f-4984ab5e18e6"). InnerVolumeSpecName "kube-api-access-5tswc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.822079 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb61d8f4-66d0-4d11-955f-4984ab5e18e6" (UID: "eb61d8f4-66d0-4d11-955f-4984ab5e18e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.832342 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.869078 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.869120 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.869135 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tswc\" (UniqueName: \"kubernetes.io/projected/eb61d8f4-66d0-4d11-955f-4984ab5e18e6-kube-api-access-5tswc\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.971673 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jk2g\" (UniqueName: \"kubernetes.io/projected/4aa25f22-5823-46d9-ae2b-a507642dc0df-kube-api-access-8jk2g\") pod \"4aa25f22-5823-46d9-ae2b-a507642dc0df\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.971865 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-utilities\") pod \"4aa25f22-5823-46d9-ae2b-a507642dc0df\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.971910 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-catalog-content\") pod \"4aa25f22-5823-46d9-ae2b-a507642dc0df\" (UID: \"4aa25f22-5823-46d9-ae2b-a507642dc0df\") " Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.972766 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-utilities" (OuterVolumeSpecName: "utilities") pod "4aa25f22-5823-46d9-ae2b-a507642dc0df" (UID: "4aa25f22-5823-46d9-ae2b-a507642dc0df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:01 crc kubenswrapper[4852]: I1210 11:58:01.977388 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa25f22-5823-46d9-ae2b-a507642dc0df-kube-api-access-8jk2g" (OuterVolumeSpecName: "kube-api-access-8jk2g") pod "4aa25f22-5823-46d9-ae2b-a507642dc0df" (UID: "4aa25f22-5823-46d9-ae2b-a507642dc0df"). InnerVolumeSpecName "kube-api-access-8jk2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.023477 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa25f22-5823-46d9-ae2b-a507642dc0df" (UID: "4aa25f22-5823-46d9-ae2b-a507642dc0df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.079915 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jk2g\" (UniqueName: \"kubernetes.io/projected/4aa25f22-5823-46d9-ae2b-a507642dc0df-kube-api-access-8jk2g\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.079965 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.079984 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa25f22-5823-46d9-ae2b-a507642dc0df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.133591 4852 generic.go:334] "Generic (PLEG): container finished" podID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerID="a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe" exitCode=0 Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.133675 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerDied","Data":"a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe"} Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.133710 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2rb2" event={"ID":"4aa25f22-5823-46d9-ae2b-a507642dc0df","Type":"ContainerDied","Data":"b3c9816b0ab42bff7a65915e97aaefdcd129b09b17a5413f566186ad1f8a7f29"} Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.133732 4852 scope.go:117] "RemoveContainer" containerID="a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.133872 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2rb2" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.138601 4852 generic.go:334] "Generic (PLEG): container finished" podID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerID="a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7" exitCode=0 Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.138654 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerDied","Data":"a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7"} Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.138690 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn8dv" event={"ID":"eb61d8f4-66d0-4d11-955f-4984ab5e18e6","Type":"ContainerDied","Data":"b9cf74512fece24b038c383a1d3321d5b2850737f1db9588a4b668d11df29064"} Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.138773 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dn8dv" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.154136 4852 scope.go:117] "RemoveContainer" containerID="8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.193155 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2rb2"] Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.193228 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h2rb2"] Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.193271 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dn8dv"] Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.194690 4852 scope.go:117] "RemoveContainer" containerID="ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.201013 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dn8dv"] Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.220798 4852 scope.go:117] "RemoveContainer" containerID="a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe" Dec 10 11:58:02 crc kubenswrapper[4852]: E1210 11:58:02.221628 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe\": container with ID starting with a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe not found: ID does not exist" containerID="a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.221675 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe"} err="failed to get container status \"a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe\": rpc error: code = NotFound desc = could not find container \"a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe\": container with ID starting with a06d0cb5b022447ec1d5e89b4ca763f9a4d85513793995fafd5928364ba982fe not found: ID does not exist" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.221711 4852 scope.go:117] "RemoveContainer" containerID="8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d" Dec 10 11:58:02 crc kubenswrapper[4852]: E1210 11:58:02.222068 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d\": container with ID starting with 8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d not found: ID does not exist" containerID="8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.222183 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d"} err="failed to get container status \"8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d\": rpc error: code = NotFound desc = could not find container \"8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d\": container with ID starting with 
8a14242a960ecb6df809ad62f951d4ed1f12b96cac0d6c4aa33bf23d2460617d not found: ID does not exist" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.222317 4852 scope.go:117] "RemoveContainer" containerID="ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549" Dec 10 11:58:02 crc kubenswrapper[4852]: E1210 11:58:02.222796 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549\": container with ID starting with ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549 not found: ID does not exist" containerID="ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.222855 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549"} err="failed to get container status \"ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549\": rpc error: code = NotFound desc = could not find container \"ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549\": container with ID starting with ace3a58d964e063e8bf1a4703eaae4879302aa57b028818513bf663aa332a549 not found: ID does not exist" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.222900 4852 scope.go:117] "RemoveContainer" containerID="a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.240009 4852 scope.go:117] "RemoveContainer" containerID="3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.271709 4852 scope.go:117] "RemoveContainer" containerID="9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.288462 4852 scope.go:117] "RemoveContainer" containerID="a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7" Dec 10 11:58:02 crc kubenswrapper[4852]: E1210 11:58:02.289936 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7\": container with ID starting with a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7 not found: ID does not exist" containerID="a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.290009 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7"} err="failed to get container status \"a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7\": rpc error: code = NotFound desc = could not find container \"a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7\": container with ID starting with a26e31f4b96c2e4e8033405cb81cbfe3f468e29fe2c061fb09e05d310f5df5e7 not found: ID does not exist" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.290062 4852 scope.go:117] "RemoveContainer" containerID="3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1" Dec 10 11:58:02 crc kubenswrapper[4852]: E1210 11:58:02.290686 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1\": container 
with ID starting with 3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1 not found: ID does not exist" containerID="3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.290772 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1"} err="failed to get container status \"3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1\": rpc error: code = NotFound desc = could not find container \"3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1\": container with ID starting with 3ad6fd3f83614e8eccfacea57d513dcbf9677c5f568d590e1c3f62dc6885c9d1 not found: ID does not exist" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.290820 4852 scope.go:117] "RemoveContainer" containerID="9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0" Dec 10 11:58:02 crc kubenswrapper[4852]: E1210 11:58:02.291517 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0\": container with ID starting with 9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0 not found: ID does not exist" containerID="9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0" Dec 10 11:58:02 crc kubenswrapper[4852]: I1210 11:58:02.291552 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0"} err="failed to get container status \"9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0\": rpc error: code = NotFound desc = could not find container \"9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0\": container with ID starting with 9cfe241b2c25267387e6f6caf7962dc78c2574e3f9faa28af41e3e53384601d0 not found: ID does not exist" Dec 10 11:58:03 crc kubenswrapper[4852]: I1210 11:58:03.636363 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6d5jr"] Dec 10 11:58:03 crc kubenswrapper[4852]: I1210 11:58:03.637267 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6d5jr" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="registry-server" containerID="cri-o://753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7" gracePeriod=2 Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.135067 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6d5jr" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.158472 4852 generic.go:334] "Generic (PLEG): container finished" podID="0161f217-65f3-4afe-8037-281871787a8b" containerID="753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7" exitCode=0 Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.158521 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerDied","Data":"753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7"} Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.158550 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d5jr" event={"ID":"0161f217-65f3-4afe-8037-281871787a8b","Type":"ContainerDied","Data":"d16e2678bf9ab8b8631bf9f51e55d4a3b92ed74e515db69ff3f345af9795b98e"} Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.158567 4852 scope.go:117] "RemoveContainer" containerID="753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.158695 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d5jr" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.177827 4852 scope.go:117] "RemoveContainer" containerID="37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.183743 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" path="/var/lib/kubelet/pods/4aa25f22-5823-46d9-ae2b-a507642dc0df/volumes" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.184420 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" path="/var/lib/kubelet/pods/eb61d8f4-66d0-4d11-955f-4984ab5e18e6/volumes" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.201769 4852 scope.go:117] "RemoveContainer" containerID="4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.214831 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smxxt\" (UniqueName: \"kubernetes.io/projected/0161f217-65f3-4afe-8037-281871787a8b-kube-api-access-smxxt\") pod \"0161f217-65f3-4afe-8037-281871787a8b\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.214933 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-utilities\") pod \"0161f217-65f3-4afe-8037-281871787a8b\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.214965 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-catalog-content\") pod \"0161f217-65f3-4afe-8037-281871787a8b\" (UID: \"0161f217-65f3-4afe-8037-281871787a8b\") " Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.220223 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-utilities" (OuterVolumeSpecName: "utilities") pod 
"0161f217-65f3-4afe-8037-281871787a8b" (UID: "0161f217-65f3-4afe-8037-281871787a8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.220602 4852 scope.go:117] "RemoveContainer" containerID="753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7" Dec 10 11:58:04 crc kubenswrapper[4852]: E1210 11:58:04.221112 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7\": container with ID starting with 753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7 not found: ID does not exist" containerID="753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.221158 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7"} err="failed to get container status \"753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7\": rpc error: code = NotFound desc = could not find container \"753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7\": container with ID starting with 753dc9dc672393a7fa20841115cc324374b6c052369d4b0a62adada3b2d47bc7 not found: ID does not exist" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.221185 4852 scope.go:117] "RemoveContainer" containerID="37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f" Dec 10 11:58:04 crc kubenswrapper[4852]: E1210 11:58:04.221869 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f\": container with ID starting with 37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f not found: ID does not exist" containerID="37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.221907 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f"} err="failed to get container status \"37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f\": rpc error: code = NotFound desc = could not find container \"37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f\": container with ID starting with 37953c6b66eb1ab2968de52bce47b7c1405e3cf7605d40495ad92b7a36d9439f not found: ID does not exist" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.221933 4852 scope.go:117] "RemoveContainer" containerID="4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418" Dec 10 11:58:04 crc kubenswrapper[4852]: E1210 11:58:04.222210 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418\": container with ID starting with 4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418 not found: ID does not exist" containerID="4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.222262 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418"} err="failed to get container 
status \"4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418\": rpc error: code = NotFound desc = could not find container \"4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418\": container with ID starting with 4f6587c5c2e531bf34babc99c09e86cc95e88a30c6e2e2840daf21e843070418 not found: ID does not exist" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.222874 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0161f217-65f3-4afe-8037-281871787a8b-kube-api-access-smxxt" (OuterVolumeSpecName: "kube-api-access-smxxt") pod "0161f217-65f3-4afe-8037-281871787a8b" (UID: "0161f217-65f3-4afe-8037-281871787a8b"). InnerVolumeSpecName "kube-api-access-smxxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.316056 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.316101 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smxxt\" (UniqueName: \"kubernetes.io/projected/0161f217-65f3-4afe-8037-281871787a8b-kube-api-access-smxxt\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.333579 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0161f217-65f3-4afe-8037-281871787a8b" (UID: "0161f217-65f3-4afe-8037-281871787a8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.416987 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161f217-65f3-4afe-8037-281871787a8b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.492847 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6d5jr"] Dec 10 11:58:04 crc kubenswrapper[4852]: I1210 11:58:04.496756 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6d5jr"] Dec 10 11:58:06 crc kubenswrapper[4852]: I1210 11:58:06.176691 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0161f217-65f3-4afe-8037-281871787a8b" path="/var/lib/kubelet/pods/0161f217-65f3-4afe-8037-281871787a8b/volumes" Dec 10 11:58:07 crc kubenswrapper[4852]: I1210 11:58:07.012573 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f7g78" Dec 10 11:58:07 crc kubenswrapper[4852]: I1210 11:58:07.070716 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2dss"] Dec 10 11:58:15 crc kubenswrapper[4852]: I1210 11:58:15.790151 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 11:58:15 crc kubenswrapper[4852]: I1210 11:58:15.790849 4852 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.109918 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" podUID="db6ae1b8-eb2a-4790-a39f-37206d33525c" containerName="registry" containerID="cri-o://bf36db3f6de83443fd52ea4f6dca05f6b14e9506bb30c879b3f6bd8bf3a64653" gracePeriod=30 Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.327543 4852 generic.go:334] "Generic (PLEG): container finished" podID="db6ae1b8-eb2a-4790-a39f-37206d33525c" containerID="bf36db3f6de83443fd52ea4f6dca05f6b14e9506bb30c879b3f6bd8bf3a64653" exitCode=0 Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.327594 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" event={"ID":"db6ae1b8-eb2a-4790-a39f-37206d33525c","Type":"ContainerDied","Data":"bf36db3f6de83443fd52ea4f6dca05f6b14e9506bb30c879b3f6bd8bf3a64653"} Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.505978 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.518969 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db6ae1b8-eb2a-4790-a39f-37206d33525c-installation-pull-secrets\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.519014 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rkd\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-kube-api-access-56rkd\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.519043 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db6ae1b8-eb2a-4790-a39f-37206d33525c-ca-trust-extracted\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.519263 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.519300 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-certificates\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.519321 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-bound-sa-token\") pod 
\"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.519342 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-tls\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.522248 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.523720 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-trusted-ca\") pod \"db6ae1b8-eb2a-4790-a39f-37206d33525c\" (UID: \"db6ae1b8-eb2a-4790-a39f-37206d33525c\") " Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.524463 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.528221 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-kube-api-access-56rkd" (OuterVolumeSpecName: "kube-api-access-56rkd") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "kube-api-access-56rkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.529388 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6ae1b8-eb2a-4790-a39f-37206d33525c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.529722 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.530792 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.537553 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.546865 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db6ae1b8-eb2a-4790-a39f-37206d33525c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "db6ae1b8-eb2a-4790-a39f-37206d33525c" (UID: "db6ae1b8-eb2a-4790-a39f-37206d33525c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625389 4852 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625466 4852 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625484 4852 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625497 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db6ae1b8-eb2a-4790-a39f-37206d33525c-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625508 4852 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db6ae1b8-eb2a-4790-a39f-37206d33525c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625519 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rkd\" (UniqueName: \"kubernetes.io/projected/db6ae1b8-eb2a-4790-a39f-37206d33525c-kube-api-access-56rkd\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:32 crc kubenswrapper[4852]: I1210 11:58:32.625528 4852 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db6ae1b8-eb2a-4790-a39f-37206d33525c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:33 crc kubenswrapper[4852]: I1210 11:58:33.334478 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" event={"ID":"db6ae1b8-eb2a-4790-a39f-37206d33525c","Type":"ContainerDied","Data":"9ff74b7545d4ed40605610e28b657d721a789f1d916f87a4fd550fda87f630d0"} Dec 10 11:58:33 crc kubenswrapper[4852]: I1210 11:58:33.334877 4852 scope.go:117] "RemoveContainer" containerID="bf36db3f6de83443fd52ea4f6dca05f6b14e9506bb30c879b3f6bd8bf3a64653" Dec 10 11:58:33 crc kubenswrapper[4852]: I1210 11:58:33.334902 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2dss" Dec 10 11:58:33 crc kubenswrapper[4852]: I1210 11:58:33.382628 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2dss"] Dec 10 11:58:33 crc kubenswrapper[4852]: I1210 11:58:33.390700 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2dss"] Dec 10 11:58:34 crc kubenswrapper[4852]: I1210 11:58:34.177367 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6ae1b8-eb2a-4790-a39f-37206d33525c" path="/var/lib/kubelet/pods/db6ae1b8-eb2a-4790-a39f-37206d33525c/volumes" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.286153 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-662t9"] Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.287202 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-662t9" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="registry-server" containerID="cri-o://d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c" gracePeriod=30 Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.293588 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmxbw"] Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.293915 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xmxbw" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="registry-server" containerID="cri-o://7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56" gracePeriod=30 Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.312591 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4cv5l"] Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.313021 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" containerID="cri-o://354da5458a6e60aebf6ae46f7fbeb789f20298d4997e8b6637776e045edc2a74" gracePeriod=30 Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.328032 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqmsb"] Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.328392 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mqmsb" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="registry-server" containerID="cri-o://ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f" gracePeriod=30 Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.342658 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn5tj"] Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.343065 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cn5tj" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="registry-server" containerID="cri-o://b8a9ba403f31399fedaa5a98dcfc33c4dd6e5c72badf4910b85f4b875feac281" gracePeriod=30 Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.349769 4852 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-64m8g"] Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.350856 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.350890 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.350909 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="extract-content" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.350919 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="extract-content" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.350934 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="extract-utilities" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.350944 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="extract-utilities" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.350961 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="extract-content" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.350972 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="extract-content" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.350985 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="extract-utilities" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351000 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="extract-utilities" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.351012 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6ae1b8-eb2a-4790-a39f-37206d33525c" containerName="registry" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351021 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6ae1b8-eb2a-4790-a39f-37206d33525c" containerName="registry" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.351031 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="extract-content" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351041 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="extract-content" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.351051 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351059 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.351074 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="extract-utilities" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351083 4852 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="extract-utilities" Dec 10 11:58:37 crc kubenswrapper[4852]: E1210 11:58:37.351101 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351113 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351364 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa25f22-5823-46d9-ae2b-a507642dc0df" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351384 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="0161f217-65f3-4afe-8037-281871787a8b" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351407 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6ae1b8-eb2a-4790-a39f-37206d33525c" containerName="registry" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.351432 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb61d8f4-66d0-4d11-955f-4984ab5e18e6" containerName="registry-server" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.352213 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.353843 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-64m8g"] Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.489694 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ff1e723c-986a-4c70-8340-aee0dacc330d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.489793 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff1e723c-986a-4c70-8340-aee0dacc330d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.489891 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj97m\" (UniqueName: \"kubernetes.io/projected/ff1e723c-986a-4c70-8340-aee0dacc330d-kube-api-access-nj97m\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.591001 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff1e723c-986a-4c70-8340-aee0dacc330d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.591079 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj97m\" (UniqueName: \"kubernetes.io/projected/ff1e723c-986a-4c70-8340-aee0dacc330d-kube-api-access-nj97m\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.591123 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ff1e723c-986a-4c70-8340-aee0dacc330d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.592680 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff1e723c-986a-4c70-8340-aee0dacc330d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.598479 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ff1e723c-986a-4c70-8340-aee0dacc330d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.613833 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj97m\" (UniqueName: \"kubernetes.io/projected/ff1e723c-986a-4c70-8340-aee0dacc330d-kube-api-access-nj97m\") pod \"marketplace-operator-79b997595-64m8g\" (UID: \"ff1e723c-986a-4c70-8340-aee0dacc330d\") " pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:37 crc kubenswrapper[4852]: I1210 11:58:37.677314 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.095344 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-64m8g"] Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.154330 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.199696 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-catalog-content\") pod \"2f614760-033c-494e-81d4-11c997e0db34\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.199737 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rq2c\" (UniqueName: \"kubernetes.io/projected/2f614760-033c-494e-81d4-11c997e0db34-kube-api-access-2rq2c\") pod \"2f614760-033c-494e-81d4-11c997e0db34\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.199759 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-utilities\") pod \"2f614760-033c-494e-81d4-11c997e0db34\" (UID: \"2f614760-033c-494e-81d4-11c997e0db34\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.200819 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-utilities" (OuterVolumeSpecName: "utilities") pod "2f614760-033c-494e-81d4-11c997e0db34" (UID: "2f614760-033c-494e-81d4-11c997e0db34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.240295 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f614760-033c-494e-81d4-11c997e0db34-kube-api-access-2rq2c" (OuterVolumeSpecName: "kube-api-access-2rq2c") pod "2f614760-033c-494e-81d4-11c997e0db34" (UID: "2f614760-033c-494e-81d4-11c997e0db34"). InnerVolumeSpecName "kube-api-access-2rq2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.254139 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.272754 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f614760-033c-494e-81d4-11c997e0db34" (UID: "2f614760-033c-494e-81d4-11c997e0db34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.279469 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f is running failed: container process not found" containerID="ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.279998 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f is running failed: container process not found" containerID="ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.280326 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f is running failed: container process not found" containerID="ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.280365 4852 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-mqmsb" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="registry-server" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.300694 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-utilities\") pod \"b65cc728-9de3-466f-902b-47f30708118c\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.300833 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xfrf\" (UniqueName: \"kubernetes.io/projected/b65cc728-9de3-466f-902b-47f30708118c-kube-api-access-5xfrf\") pod \"b65cc728-9de3-466f-902b-47f30708118c\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.300885 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-catalog-content\") pod \"b65cc728-9de3-466f-902b-47f30708118c\" (UID: \"b65cc728-9de3-466f-902b-47f30708118c\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.301347 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.301365 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rq2c\" (UniqueName: \"kubernetes.io/projected/2f614760-033c-494e-81d4-11c997e0db34-kube-api-access-2rq2c\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.301379 4852 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f614760-033c-494e-81d4-11c997e0db34-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.302021 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-utilities" (OuterVolumeSpecName: "utilities") pod "b65cc728-9de3-466f-902b-47f30708118c" (UID: "b65cc728-9de3-466f-902b-47f30708118c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.331838 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65cc728-9de3-466f-902b-47f30708118c-kube-api-access-5xfrf" (OuterVolumeSpecName: "kube-api-access-5xfrf") pod "b65cc728-9de3-466f-902b-47f30708118c" (UID: "b65cc728-9de3-466f-902b-47f30708118c"). InnerVolumeSpecName "kube-api-access-5xfrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.384170 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4cv5l_1919e18e-d914-4ee7-8bf4-6de02e6760c2/marketplace-operator/1.log" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.384267 4852 generic.go:334] "Generic (PLEG): container finished" podID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerID="354da5458a6e60aebf6ae46f7fbeb789f20298d4997e8b6637776e045edc2a74" exitCode=0 Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.384353 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerDied","Data":"354da5458a6e60aebf6ae46f7fbeb789f20298d4997e8b6637776e045edc2a74"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.384407 4852 scope.go:117] "RemoveContainer" containerID="f466e917954b90e012d340ea25b8b410327d3d7eee4f3803597e3074b432fdc5" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.391854 4852 generic.go:334] "Generic (PLEG): container finished" podID="2f614760-033c-494e-81d4-11c997e0db34" containerID="d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c" exitCode=0 Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.391910 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerDied","Data":"d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.391986 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-662t9" event={"ID":"2f614760-033c-494e-81d4-11c997e0db34","Type":"ContainerDied","Data":"27cbed4edd4bc3f76279910fe8eaa7b6593ee99b2f5f377f907f1a84db5fa840"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.391935 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-662t9" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.396067 4852 generic.go:334] "Generic (PLEG): container finished" podID="b65cc728-9de3-466f-902b-47f30708118c" containerID="7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56" exitCode=0 Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.396142 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmxbw" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.396180 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerDied","Data":"7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.396222 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmxbw" event={"ID":"b65cc728-9de3-466f-902b-47f30708118c","Type":"ContainerDied","Data":"c9bd79f657f60ab01cee544f42e8bf701c29650405a4715d2b02bcdd9d73406b"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.398644 4852 generic.go:334] "Generic (PLEG): container finished" podID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerID="b8a9ba403f31399fedaa5a98dcfc33c4dd6e5c72badf4910b85f4b875feac281" exitCode=0 Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.398697 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerDied","Data":"b8a9ba403f31399fedaa5a98dcfc33c4dd6e5c72badf4910b85f4b875feac281"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.400262 4852 generic.go:334] "Generic (PLEG): container finished" podID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerID="ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f" exitCode=0 Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.400303 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerDied","Data":"ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.418588 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" event={"ID":"ff1e723c-986a-4c70-8340-aee0dacc330d","Type":"ContainerStarted","Data":"006efbe1fc9b3f137402747b42e199205903c946f4d96080c9b0a89663b5dd67"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.418728 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" event={"ID":"ff1e723c-986a-4c70-8340-aee0dacc330d","Type":"ContainerStarted","Data":"65248a0d3c025fd50925c3d4d1c07bae9905f5727835c873177ba946ef2887ab"} Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.420109 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.421834 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b65cc728-9de3-466f-902b-47f30708118c" (UID: "b65cc728-9de3-466f-902b-47f30708118c"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.424242 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xfrf\" (UniqueName: \"kubernetes.io/projected/b65cc728-9de3-466f-902b-47f30708118c-kube-api-access-5xfrf\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.424282 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.427294 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-64m8g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" start-of-body= Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.427358 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" podUID="ff1e723c-986a-4c70-8340-aee0dacc330d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.461946 4852 scope.go:117] "RemoveContainer" containerID="d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.466604 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" podStartSLOduration=1.466580548 podStartE2EDuration="1.466580548s" podCreationTimestamp="2025-12-10 11:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 11:58:38.465984563 +0000 UTC m=+404.551509787" watchObservedRunningTime="2025-12-10 11:58:38.466580548 +0000 UTC m=+404.552105772" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.492885 4852 scope.go:117] "RemoveContainer" containerID="a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.506426 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.507537 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-662t9"] Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.514661 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-662t9"] Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.528092 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65cc728-9de3-466f-902b-47f30708118c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.529855 4852 scope.go:117] "RemoveContainer" containerID="f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.542205 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cn5tj" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.556287 4852 scope.go:117] "RemoveContainer" containerID="d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.556739 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c\": container with ID starting with d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c not found: ID does not exist" containerID="d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.556781 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c"} err="failed to get container status \"d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c\": rpc error: code = NotFound desc = could not find container \"d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c\": container with ID starting with d46c39d08fcaf17593b93ed5e2029fb2ca40ef34c0dfdf14d12770da4a0f080c not found: ID does not exist" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.556809 4852 scope.go:117] "RemoveContainer" containerID="a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.557045 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef\": container with ID starting with a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef not found: ID does not exist" containerID="a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.557075 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef"} err="failed to get container status \"a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef\": rpc error: code = NotFound desc = could not find container \"a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef\": container with ID starting with a1db1fda537a3f787ebb8d10b30bfc7922869ba0d5bfef268ef508c4e4cce5ef not found: ID does not exist" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.557092 4852 scope.go:117] "RemoveContainer" containerID="f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.557474 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a\": container with ID starting with f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a not found: ID does not exist" containerID="f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.557504 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a"} err="failed to get container status \"f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a\": rpc error: code = 
NotFound desc = could not find container \"f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a\": container with ID starting with f3d3757b1b37d7cc907577d7e3f681165dbd589260f25822ff848500c9705b2a not found: ID does not exist" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.557523 4852 scope.go:117] "RemoveContainer" containerID="7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.580809 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.581670 4852 scope.go:117] "RemoveContainer" containerID="41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.609880 4852 scope.go:117] "RemoveContainer" containerID="43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634510 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-utilities\") pod \"031e1f57-c87c-4d8f-a05a-380efb0979ec\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634651 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-catalog-content\") pod \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634713 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-utilities\") pod \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634749 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5p8\" (UniqueName: \"kubernetes.io/projected/031e1f57-c87c-4d8f-a05a-380efb0979ec-kube-api-access-hr5p8\") pod \"031e1f57-c87c-4d8f-a05a-380efb0979ec\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634790 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-catalog-content\") pod \"031e1f57-c87c-4d8f-a05a-380efb0979ec\" (UID: \"031e1f57-c87c-4d8f-a05a-380efb0979ec\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634940 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-trusted-ca\") pod \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.634996 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29h7b\" (UniqueName: \"kubernetes.io/projected/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-kube-api-access-29h7b\") pod \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\" (UID: \"aa4dd551-8252-43a4-b1b3-d4daf088ddd5\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.635039 
4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5cf\" (UniqueName: \"kubernetes.io/projected/1919e18e-d914-4ee7-8bf4-6de02e6760c2-kube-api-access-qq5cf\") pod \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.635083 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-operator-metrics\") pod \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\" (UID: \"1919e18e-d914-4ee7-8bf4-6de02e6760c2\") " Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.638239 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1919e18e-d914-4ee7-8bf4-6de02e6760c2" (UID: "1919e18e-d914-4ee7-8bf4-6de02e6760c2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.638586 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-utilities" (OuterVolumeSpecName: "utilities") pod "aa4dd551-8252-43a4-b1b3-d4daf088ddd5" (UID: "aa4dd551-8252-43a4-b1b3-d4daf088ddd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.641071 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031e1f57-c87c-4d8f-a05a-380efb0979ec-kube-api-access-hr5p8" (OuterVolumeSpecName: "kube-api-access-hr5p8") pod "031e1f57-c87c-4d8f-a05a-380efb0979ec" (UID: "031e1f57-c87c-4d8f-a05a-380efb0979ec"). InnerVolumeSpecName "kube-api-access-hr5p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.641314 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1919e18e-d914-4ee7-8bf4-6de02e6760c2-kube-api-access-qq5cf" (OuterVolumeSpecName: "kube-api-access-qq5cf") pod "1919e18e-d914-4ee7-8bf4-6de02e6760c2" (UID: "1919e18e-d914-4ee7-8bf4-6de02e6760c2"). InnerVolumeSpecName "kube-api-access-qq5cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.641413 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-kube-api-access-29h7b" (OuterVolumeSpecName: "kube-api-access-29h7b") pod "aa4dd551-8252-43a4-b1b3-d4daf088ddd5" (UID: "aa4dd551-8252-43a4-b1b3-d4daf088ddd5"). InnerVolumeSpecName "kube-api-access-29h7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.642774 4852 scope.go:117] "RemoveContainer" containerID="7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.643279 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56\": container with ID starting with 7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56 not found: ID does not exist" containerID="7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.643313 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56"} err="failed to get container status \"7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56\": rpc error: code = NotFound desc = could not find container \"7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56\": container with ID starting with 7321118b926acd6cb5151a73d75225496b1027dbc88bdfd0082d46be37af4e56 not found: ID does not exist" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.643343 4852 scope.go:117] "RemoveContainer" containerID="41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.643586 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1919e18e-d914-4ee7-8bf4-6de02e6760c2" (UID: "1919e18e-d914-4ee7-8bf4-6de02e6760c2"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.643837 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4\": container with ID starting with 41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4 not found: ID does not exist" containerID="41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.643867 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4"} err="failed to get container status \"41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4\": rpc error: code = NotFound desc = could not find container \"41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4\": container with ID starting with 41ab9dee06525f22831d392484cb968495e7e248eaed79b8964b446586d3baa4 not found: ID does not exist" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.643884 4852 scope.go:117] "RemoveContainer" containerID="43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a" Dec 10 11:58:38 crc kubenswrapper[4852]: E1210 11:58:38.644776 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a\": container with ID starting with 43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a not found: ID does not exist" containerID="43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.644848 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a"} err="failed to get container status \"43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a\": rpc error: code = NotFound desc = could not find container \"43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a\": container with ID starting with 43ddf543fd7c25d8b568bc3086624e2ee52f3e114d577dcc8ae50f8af4b2379a not found: ID does not exist" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.645310 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-utilities" (OuterVolumeSpecName: "utilities") pod "031e1f57-c87c-4d8f-a05a-380efb0979ec" (UID: "031e1f57-c87c-4d8f-a05a-380efb0979ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.661309 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa4dd551-8252-43a4-b1b3-d4daf088ddd5" (UID: "aa4dd551-8252-43a4-b1b3-d4daf088ddd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737002 4852 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737065 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29h7b\" (UniqueName: \"kubernetes.io/projected/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-kube-api-access-29h7b\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737079 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq5cf\" (UniqueName: \"kubernetes.io/projected/1919e18e-d914-4ee7-8bf4-6de02e6760c2-kube-api-access-qq5cf\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737091 4852 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1919e18e-d914-4ee7-8bf4-6de02e6760c2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737103 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737114 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737149 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4dd551-8252-43a4-b1b3-d4daf088ddd5-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737160 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5p8\" (UniqueName: \"kubernetes.io/projected/031e1f57-c87c-4d8f-a05a-380efb0979ec-kube-api-access-hr5p8\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.737562 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmxbw"] Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.743485 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xmxbw"] Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.757633 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "031e1f57-c87c-4d8f-a05a-380efb0979ec" (UID: "031e1f57-c87c-4d8f-a05a-380efb0979ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 11:58:38 crc kubenswrapper[4852]: I1210 11:58:38.838773 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/031e1f57-c87c-4d8f-a05a-380efb0979ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.425094 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn5tj" event={"ID":"031e1f57-c87c-4d8f-a05a-380efb0979ec","Type":"ContainerDied","Data":"776a900a85ae063cd3fd29f8eda40b793807679965bfb12afab3cce63fbb3874"} Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.425138 4852 scope.go:117] "RemoveContainer" containerID="b8a9ba403f31399fedaa5a98dcfc33c4dd6e5c72badf4910b85f4b875feac281" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.425216 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn5tj" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.433785 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mqmsb" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.433785 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mqmsb" event={"ID":"aa4dd551-8252-43a4-b1b3-d4daf088ddd5","Type":"ContainerDied","Data":"c59853b6a0f45aeca71e3ccfd259fb6b8e366b07e7be2bcc8c23c9a0538ed0ea"} Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.435310 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" event={"ID":"1919e18e-d914-4ee7-8bf4-6de02e6760c2","Type":"ContainerDied","Data":"c028b57be69aaf2caf0c7472f5e848ccec5b480717ad0d03260d95acb826af5c"} Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.435399 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.443461 4852 scope.go:117] "RemoveContainer" containerID="f3dc9326df04b4e8e1aa54ba578e2bf73788f9561e0ec9741ebd4a9486add6bb" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.445641 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-64m8g" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.460801 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn5tj"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.464044 4852 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4cv5l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.464102 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4cv5l" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.464601 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cn5tj"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.469562 4852 scope.go:117] "RemoveContainer" containerID="4ec937935dc93cbfb6ee95c4d0242123a54660befb64c511bb947add052b7d08" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.497956 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nbwq2"] Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498196 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498210 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498221 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498242 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498254 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498260 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498270 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498276 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="extract-utilities" Dec 10 
11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498285 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498291 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498298 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498304 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498311 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498317 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498325 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498331 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498339 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498347 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498355 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498363 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498372 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498380 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498390 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498396 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498403 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498411 4852 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="extract-utilities" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498422 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498428 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="extract-content" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498509 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498518 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65cc728-9de3-466f-902b-47f30708118c" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498528 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498536 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f614760-033c-494e-81d4-11c997e0db34" containerName="registry-server" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498543 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498551 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498558 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: E1210 11:58:39.498643 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.498688 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" containerName="marketplace-operator" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.499304 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.506288 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.507885 4852 scope.go:117] "RemoveContainer" containerID="ea25e045b5147605de8722933f1facb963998074a7b0e8727ce6cbf512b3e29f" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.509826 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4cv5l"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.514634 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4cv5l"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.518437 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nbwq2"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.524571 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqmsb"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.535223 4852 scope.go:117] "RemoveContainer" containerID="5b9636d606893116b2f4d06610d889aff1cd3d43d5b49c70783fef222c934c13" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.537270 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mqmsb"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.546248 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-catalog-content\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.547108 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsbjb\" (UniqueName: \"kubernetes.io/projected/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-kube-api-access-nsbjb\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.547364 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-utilities\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.556561 4852 scope.go:117] "RemoveContainer" containerID="3ae0d1927db387b3f221b0f878ada285ad5efc1b0b644e43187477463b7238e6" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.575109 4852 scope.go:117] "RemoveContainer" containerID="354da5458a6e60aebf6ae46f7fbeb789f20298d4997e8b6637776e045edc2a74" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.650767 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-catalog-content\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc 
kubenswrapper[4852]: I1210 11:58:39.651197 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsbjb\" (UniqueName: \"kubernetes.io/projected/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-kube-api-access-nsbjb\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.651253 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-catalog-content\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.651258 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-utilities\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.651508 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-utilities\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.672255 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsbjb\" (UniqueName: \"kubernetes.io/projected/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-kube-api-access-nsbjb\") pod \"certified-operators-nbwq2\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.698093 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5hzpk"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.699038 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.702468 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.709290 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hzpk"] Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.752968 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st7lw\" (UniqueName: \"kubernetes.io/projected/8ad77dbc-86d2-4bbc-8312-4529077f52a6-kube-api-access-st7lw\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.753041 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-utilities\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.753154 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-catalog-content\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.840912 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.854762 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-catalog-content\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.854832 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st7lw\" (UniqueName: \"kubernetes.io/projected/8ad77dbc-86d2-4bbc-8312-4529077f52a6-kube-api-access-st7lw\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.854861 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-utilities\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.856541 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-catalog-content\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.856662 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-utilities\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:39 crc kubenswrapper[4852]: I1210 11:58:39.874785 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st7lw\" (UniqueName: \"kubernetes.io/projected/8ad77dbc-86d2-4bbc-8312-4529077f52a6-kube-api-access-st7lw\") pod \"community-operators-5hzpk\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.016252 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.179109 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031e1f57-c87c-4d8f-a05a-380efb0979ec" path="/var/lib/kubelet/pods/031e1f57-c87c-4d8f-a05a-380efb0979ec/volumes" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.180263 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1919e18e-d914-4ee7-8bf4-6de02e6760c2" path="/var/lib/kubelet/pods/1919e18e-d914-4ee7-8bf4-6de02e6760c2/volumes" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.180955 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f614760-033c-494e-81d4-11c997e0db34" path="/var/lib/kubelet/pods/2f614760-033c-494e-81d4-11c997e0db34/volumes" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.182456 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4dd551-8252-43a4-b1b3-d4daf088ddd5" path="/var/lib/kubelet/pods/aa4dd551-8252-43a4-b1b3-d4daf088ddd5/volumes" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.183503 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65cc728-9de3-466f-902b-47f30708118c" path="/var/lib/kubelet/pods/b65cc728-9de3-466f-902b-47f30708118c/volumes" Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.208608 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nbwq2"] Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.449553 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hzpk"] Dec 10 11:58:40 crc kubenswrapper[4852]: I1210 11:58:40.457112 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbwq2" event={"ID":"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d","Type":"ContainerStarted","Data":"82658192db401aa6d526c1964ce59171b12a52e47fb8af0223390e04b22da256"} Dec 10 11:58:40 crc kubenswrapper[4852]: W1210 11:58:40.459901 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad77dbc_86d2_4bbc_8312_4529077f52a6.slice/crio-84b991bb12ac88b8b4d8e4b407a4f4f4652cb08c304d9ff76ffdd3d40d7a06c1 WatchSource:0}: Error finding container 84b991bb12ac88b8b4d8e4b407a4f4f4652cb08c304d9ff76ffdd3d40d7a06c1: Status 404 returned error can't find the container with id 84b991bb12ac88b8b4d8e4b407a4f4f4652cb08c304d9ff76ffdd3d40d7a06c1 Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.466927 4852 generic.go:334] "Generic (PLEG): container finished" podID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerID="416afe29d58a33c6f40f8a87090ea3f2e86be86c58cad05159685bb11fb1a8ab" exitCode=0 Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.466988 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbwq2" event={"ID":"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d","Type":"ContainerDied","Data":"416afe29d58a33c6f40f8a87090ea3f2e86be86c58cad05159685bb11fb1a8ab"} Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.468855 4852 generic.go:334] "Generic (PLEG): container finished" podID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerID="bb87a903ac3b40e866fe731124acdef655eadf68094f8cc7cfb4868f72b7ddf1" exitCode=0 Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.468894 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hzpk" 
event={"ID":"8ad77dbc-86d2-4bbc-8312-4529077f52a6","Type":"ContainerDied","Data":"bb87a903ac3b40e866fe731124acdef655eadf68094f8cc7cfb4868f72b7ddf1"} Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.468919 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hzpk" event={"ID":"8ad77dbc-86d2-4bbc-8312-4529077f52a6","Type":"ContainerStarted","Data":"84b991bb12ac88b8b4d8e4b407a4f4f4652cb08c304d9ff76ffdd3d40d7a06c1"} Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.469952 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.895337 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fqx2n"] Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.896707 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.899029 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.907700 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqx2n"] Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.989092 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc96b426-5b87-4797-91be-ae9864b34b82-catalog-content\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.989185 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc96b426-5b87-4797-91be-ae9864b34b82-utilities\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:41 crc kubenswrapper[4852]: I1210 11:58:41.989304 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsqbr\" (UniqueName: \"kubernetes.io/projected/fc96b426-5b87-4797-91be-ae9864b34b82-kube-api-access-bsqbr\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.093830 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc96b426-5b87-4797-91be-ae9864b34b82-catalog-content\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.093936 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc96b426-5b87-4797-91be-ae9864b34b82-utilities\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.094191 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsqbr\" 
(UniqueName: \"kubernetes.io/projected/fc96b426-5b87-4797-91be-ae9864b34b82-kube-api-access-bsqbr\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.094455 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc96b426-5b87-4797-91be-ae9864b34b82-catalog-content\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.094790 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc96b426-5b87-4797-91be-ae9864b34b82-utilities\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.096031 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4gn7"] Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.097143 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.099656 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.108017 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4gn7"] Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.117930 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsqbr\" (UniqueName: \"kubernetes.io/projected/fc96b426-5b87-4797-91be-ae9864b34b82-kube-api-access-bsqbr\") pod \"redhat-marketplace-fqx2n\" (UID: \"fc96b426-5b87-4797-91be-ae9864b34b82\") " pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.199874 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eaac3d-ea36-4ea6-90dd-0b376a897f27-utilities\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.200246 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eaac3d-ea36-4ea6-90dd-0b376a897f27-catalog-content\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.200380 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wln6w\" (UniqueName: \"kubernetes.io/projected/64eaac3d-ea36-4ea6-90dd-0b376a897f27-kube-api-access-wln6w\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.233997 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.301982 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eaac3d-ea36-4ea6-90dd-0b376a897f27-utilities\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.302038 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eaac3d-ea36-4ea6-90dd-0b376a897f27-catalog-content\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.302107 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wln6w\" (UniqueName: \"kubernetes.io/projected/64eaac3d-ea36-4ea6-90dd-0b376a897f27-kube-api-access-wln6w\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.302752 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64eaac3d-ea36-4ea6-90dd-0b376a897f27-utilities\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.302752 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64eaac3d-ea36-4ea6-90dd-0b376a897f27-catalog-content\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.318818 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wln6w\" (UniqueName: \"kubernetes.io/projected/64eaac3d-ea36-4ea6-90dd-0b376a897f27-kube-api-access-wln6w\") pod \"redhat-operators-k4gn7\" (UID: \"64eaac3d-ea36-4ea6-90dd-0b376a897f27\") " pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:42 crc kubenswrapper[4852]: I1210 11:58:42.419010 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:58:43 crc kubenswrapper[4852]: I1210 11:58:43.635199 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4gn7"] Dec 10 11:58:43 crc kubenswrapper[4852]: W1210 11:58:43.639901 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64eaac3d_ea36_4ea6_90dd_0b376a897f27.slice/crio-c90a3827dec53a08eaef33642794dcef1cf26e2d274ed44ffdf5830d6cc00b06 WatchSource:0}: Error finding container c90a3827dec53a08eaef33642794dcef1cf26e2d274ed44ffdf5830d6cc00b06: Status 404 returned error can't find the container with id c90a3827dec53a08eaef33642794dcef1cf26e2d274ed44ffdf5830d6cc00b06 Dec 10 11:58:43 crc kubenswrapper[4852]: I1210 11:58:43.684790 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqx2n"] Dec 10 11:58:43 crc kubenswrapper[4852]: W1210 11:58:43.695734 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc96b426_5b87_4797_91be_ae9864b34b82.slice/crio-fbd446232ca9ab5d6fe56e21263bdc4f6e22a494359242e2b9c5c06adf72dd5f WatchSource:0}: Error finding container fbd446232ca9ab5d6fe56e21263bdc4f6e22a494359242e2b9c5c06adf72dd5f: Status 404 returned error can't find the container with id fbd446232ca9ab5d6fe56e21263bdc4f6e22a494359242e2b9c5c06adf72dd5f Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.490323 4852 generic.go:334] "Generic (PLEG): container finished" podID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerID="2cb4645019933d1bd04ea9ca34b160e0e88891cbe74d3362f93f60aa9fae30e3" exitCode=0 Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.490505 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbwq2" event={"ID":"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d","Type":"ContainerDied","Data":"2cb4645019933d1bd04ea9ca34b160e0e88891cbe74d3362f93f60aa9fae30e3"} Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.493079 4852 generic.go:334] "Generic (PLEG): container finished" podID="64eaac3d-ea36-4ea6-90dd-0b376a897f27" containerID="7908e75979b786fc48f47cb7eb8de4b2e0292735bb25dc02ff9d2a828a509222" exitCode=0 Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.493160 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gn7" event={"ID":"64eaac3d-ea36-4ea6-90dd-0b376a897f27","Type":"ContainerDied","Data":"7908e75979b786fc48f47cb7eb8de4b2e0292735bb25dc02ff9d2a828a509222"} Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.493265 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gn7" event={"ID":"64eaac3d-ea36-4ea6-90dd-0b376a897f27","Type":"ContainerStarted","Data":"c90a3827dec53a08eaef33642794dcef1cf26e2d274ed44ffdf5830d6cc00b06"} Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.502010 4852 generic.go:334] "Generic (PLEG): container finished" podID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerID="52f2ce02c159f10ea2500d439f1daf0d6832450de3ed16ae251b11c99cf525a8" exitCode=0 Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.502163 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hzpk" event={"ID":"8ad77dbc-86d2-4bbc-8312-4529077f52a6","Type":"ContainerDied","Data":"52f2ce02c159f10ea2500d439f1daf0d6832450de3ed16ae251b11c99cf525a8"} Dec 10 
Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.508865 4852 generic.go:334] "Generic (PLEG): container finished" podID="fc96b426-5b87-4797-91be-ae9864b34b82" containerID="d9e908203e0b85589163eb95b18d15c4b56091aac679ba432aafec7e73ff6b0d" exitCode=0
Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.508924 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqx2n" event={"ID":"fc96b426-5b87-4797-91be-ae9864b34b82","Type":"ContainerDied","Data":"d9e908203e0b85589163eb95b18d15c4b56091aac679ba432aafec7e73ff6b0d"}
Dec 10 11:58:44 crc kubenswrapper[4852]: I1210 11:58:44.508956 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqx2n" event={"ID":"fc96b426-5b87-4797-91be-ae9864b34b82","Type":"ContainerStarted","Data":"fbd446232ca9ab5d6fe56e21263bdc4f6e22a494359242e2b9c5c06adf72dd5f"}
Dec 10 11:58:45 crc kubenswrapper[4852]: I1210 11:58:45.518505 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbwq2" event={"ID":"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d","Type":"ContainerStarted","Data":"9dcf5a7658bdef305e82063b4011a5f908ccb533e4139a2f372225399dd50dd4"}
Dec 10 11:58:45 crc kubenswrapper[4852]: I1210 11:58:45.539196 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nbwq2" podStartSLOduration=3.056053313 podStartE2EDuration="6.539179356s" podCreationTimestamp="2025-12-10 11:58:39 +0000 UTC" firstStartedPulling="2025-12-10 11:58:41.469751249 +0000 UTC m=+407.555276473" lastFinishedPulling="2025-12-10 11:58:44.952877292 +0000 UTC m=+411.038402516" observedRunningTime="2025-12-10 11:58:45.535255505 +0000 UTC m=+411.620780739" watchObservedRunningTime="2025-12-10 11:58:45.539179356 +0000 UTC m=+411.624704580"
Dec 10 11:58:45 crc kubenswrapper[4852]: I1210 11:58:45.790680 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 11:58:45 crc kubenswrapper[4852]: I1210 11:58:45.790758 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 11:58:46 crc kubenswrapper[4852]: I1210 11:58:46.532361 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqx2n" event={"ID":"fc96b426-5b87-4797-91be-ae9864b34b82","Type":"ContainerStarted","Data":"78e0499388fa5bd3cda92d21f595062342a1caea700ad6ad5a15208fd4de5dd4"}
Dec 10 11:58:46 crc kubenswrapper[4852]: I1210 11:58:46.535461 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gn7" event={"ID":"64eaac3d-ea36-4ea6-90dd-0b376a897f27","Type":"ContainerStarted","Data":"fb486194a3ff6a8408638f6182de97c2b655929de655e6adc61e7d19ea862fcb"}
Dec 10 11:58:47 crc kubenswrapper[4852]: I1210 11:58:47.546150 4852 generic.go:334] "Generic (PLEG): container finished" podID="64eaac3d-ea36-4ea6-90dd-0b376a897f27" containerID="fb486194a3ff6a8408638f6182de97c2b655929de655e6adc61e7d19ea862fcb" exitCode=0
Dec 10 11:58:47 crc kubenswrapper[4852]: I1210 11:58:47.546349 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gn7" event={"ID":"64eaac3d-ea36-4ea6-90dd-0b376a897f27","Type":"ContainerDied","Data":"fb486194a3ff6a8408638f6182de97c2b655929de655e6adc61e7d19ea862fcb"}
Dec 10 11:58:47 crc kubenswrapper[4852]: I1210 11:58:47.555596 4852 generic.go:334] "Generic (PLEG): container finished" podID="fc96b426-5b87-4797-91be-ae9864b34b82" containerID="78e0499388fa5bd3cda92d21f595062342a1caea700ad6ad5a15208fd4de5dd4" exitCode=0
Dec 10 11:58:47 crc kubenswrapper[4852]: I1210 11:58:47.555938 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqx2n" event={"ID":"fc96b426-5b87-4797-91be-ae9864b34b82","Type":"ContainerDied","Data":"78e0499388fa5bd3cda92d21f595062342a1caea700ad6ad5a15208fd4de5dd4"}
Dec 10 11:58:48 crc kubenswrapper[4852]: I1210 11:58:48.566752 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hzpk" event={"ID":"8ad77dbc-86d2-4bbc-8312-4529077f52a6","Type":"ContainerStarted","Data":"00a8ee84414d06ebc9ce391f216a8b84b14392d6af85268353c96a346f29093f"}
Dec 10 11:58:48 crc kubenswrapper[4852]: I1210 11:58:48.586529 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5hzpk" podStartSLOduration=4.473436531 podStartE2EDuration="9.586507011s" podCreationTimestamp="2025-12-10 11:58:39 +0000 UTC" firstStartedPulling="2025-12-10 11:58:41.470357885 +0000 UTC m=+407.555883109" lastFinishedPulling="2025-12-10 11:58:46.583428365 +0000 UTC m=+412.668953589" observedRunningTime="2025-12-10 11:58:48.584762356 +0000 UTC m=+414.670287590" watchObservedRunningTime="2025-12-10 11:58:48.586507011 +0000 UTC m=+414.672032245"
Dec 10 11:58:49 crc kubenswrapper[4852]: I1210 11:58:49.577569 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gn7" event={"ID":"64eaac3d-ea36-4ea6-90dd-0b376a897f27","Type":"ContainerStarted","Data":"91ce4190fdb7c6479a1fac14dd03719cdd5c6cbb1e07ad253b3b54abfb02d440"}
Dec 10 11:58:49 crc kubenswrapper[4852]: I1210 11:58:49.599149 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4gn7" podStartSLOduration=3.64677754 podStartE2EDuration="7.599127689s" podCreationTimestamp="2025-12-10 11:58:42 +0000 UTC" firstStartedPulling="2025-12-10 11:58:44.494686619 +0000 UTC m=+410.580211843" lastFinishedPulling="2025-12-10 11:58:48.447036768 +0000 UTC m=+414.532561992" observedRunningTime="2025-12-10 11:58:49.595128816 +0000 UTC m=+415.680654040" watchObservedRunningTime="2025-12-10 11:58:49.599127689 +0000 UTC m=+415.684652913"
Dec 10 11:58:49 crc kubenswrapper[4852]: I1210 11:58:49.842081 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nbwq2"
Dec 10 11:58:49 crc kubenswrapper[4852]: I1210 11:58:49.842140 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nbwq2"
Dec 10 11:58:49 crc kubenswrapper[4852]: I1210 11:58:49.888647 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nbwq2"
Dec 10 11:58:50 crc kubenswrapper[4852]: I1210 11:58:50.016742 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5hzpk"
Dec 10 11:58:50 crc kubenswrapper[4852]: I1210 11:58:50.016818 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5hzpk"
Dec 10 11:58:50 crc kubenswrapper[4852]: I1210 11:58:50.661553 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nbwq2"
Dec 10 11:58:51 crc kubenswrapper[4852]: I1210 11:58:51.057352 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5hzpk" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="registry-server" probeResult="failure" output=<
Dec 10 11:58:51 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 11:58:51 crc kubenswrapper[4852]: >
Dec 10 11:58:51 crc kubenswrapper[4852]: I1210 11:58:51.603527 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqx2n" event={"ID":"fc96b426-5b87-4797-91be-ae9864b34b82","Type":"ContainerStarted","Data":"03e1e023652e80f99dce9a0d201ba97867659c3b7a8a74724dd6d9516680f66e"}
Dec 10 11:58:51 crc kubenswrapper[4852]: I1210 11:58:51.619020 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fqx2n" podStartSLOduration=3.91854509 podStartE2EDuration="10.618999086s" podCreationTimestamp="2025-12-10 11:58:41 +0000 UTC" firstStartedPulling="2025-12-10 11:58:44.512820635 +0000 UTC m=+410.598345859" lastFinishedPulling="2025-12-10 11:58:51.213274631 +0000 UTC m=+417.298799855" observedRunningTime="2025-12-10 11:58:51.616732457 +0000 UTC m=+417.702257681" watchObservedRunningTime="2025-12-10 11:58:51.618999086 +0000 UTC m=+417.704524310"
Dec 10 11:58:52 crc kubenswrapper[4852]: I1210 11:58:52.234801 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fqx2n"
Dec 10 11:58:52 crc kubenswrapper[4852]: I1210 11:58:52.235078 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fqx2n"
Dec 10 11:58:52 crc kubenswrapper[4852]: I1210 11:58:52.419190 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4gn7"
Dec 10 11:58:52 crc kubenswrapper[4852]: I1210 11:58:52.419288 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4gn7"
Dec 10 11:58:53 crc kubenswrapper[4852]: I1210 11:58:53.281765 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-fqx2n" podUID="fc96b426-5b87-4797-91be-ae9864b34b82" containerName="registry-server" probeResult="failure" output=<
Dec 10 11:58:53 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 11:58:53 crc kubenswrapper[4852]: >
Dec 10 11:58:53 crc kubenswrapper[4852]: I1210 11:58:53.459634 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k4gn7" podUID="64eaac3d-ea36-4ea6-90dd-0b376a897f27" containerName="registry-server" probeResult="failure" output=<
Dec 10 11:58:53 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 11:58:53 crc kubenswrapper[4852]: >
Dec 10 11:59:00 crc kubenswrapper[4852]: I1210 11:59:00.071044 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5hzpk"
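The probe output above, timeout: failed to connect service ":50051" within 1s, matches the gRPC health checker these registry pods appear to use: the registry-server container is running, but nothing accepts on port 50051 until the catalog has loaded, so the startup probe keeps failing and then flips to started around 11:59:00. A rough stand-in using a plain TCP dial with the same 1-second budget (the real probe speaks the gRPC health protocol inside the container, so this is only an approximation):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Same budget as the failing startup probe in the log: connect to
	// :50051 within 1s or report failure.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
	if err != nil {
		fmt.Println("probe failed:", err) // what the kubelet records above
		return
	}
	conn.Close()
	fmt.Println("probe ok")
}
```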
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5hzpk" Dec 10 11:59:02 crc kubenswrapper[4852]: I1210 11:59:02.277716 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:59:02 crc kubenswrapper[4852]: I1210 11:59:02.321716 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fqx2n" Dec 10 11:59:02 crc kubenswrapper[4852]: I1210 11:59:02.463894 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:59:02 crc kubenswrapper[4852]: I1210 11:59:02.503390 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4gn7" Dec 10 11:59:15 crc kubenswrapper[4852]: I1210 11:59:15.790698 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 11:59:15 crc kubenswrapper[4852]: I1210 11:59:15.791284 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 11:59:15 crc kubenswrapper[4852]: I1210 11:59:15.791331 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 11:59:15 crc kubenswrapper[4852]: I1210 11:59:15.792150 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81155a692a32264fc4c153e9736724fd23b0a15aa61c032f0b091089d2a44202"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 11:59:15 crc kubenswrapper[4852]: I1210 11:59:15.792258 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://81155a692a32264fc4c153e9736724fd23b0a15aa61c032f0b091089d2a44202" gracePeriod=600 Dec 10 11:59:16 crc kubenswrapper[4852]: I1210 11:59:16.738020 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="81155a692a32264fc4c153e9736724fd23b0a15aa61c032f0b091089d2a44202" exitCode=0 Dec 10 11:59:16 crc kubenswrapper[4852]: I1210 11:59:16.738145 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"81155a692a32264fc4c153e9736724fd23b0a15aa61c032f0b091089d2a44202"} Dec 10 11:59:16 crc kubenswrapper[4852]: I1210 11:59:16.738386 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" 
event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"5f347330cd2d5cdf86ff9446a444fd62f87cac15b734887bf743393441452d4f"} Dec 10 11:59:16 crc kubenswrapper[4852]: I1210 11:59:16.738414 4852 scope.go:117] "RemoveContainer" containerID="df5387583f66f93b26a76954748f69c02df08bb9c349c9c9465ec2fb73fa4fd0" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.182400 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"] Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.185330 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.188153 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"] Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.188176 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.189174 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.244670 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkv6r\" (UniqueName: \"kubernetes.io/projected/3203c694-b4b0-43d8-a728-8a0804346e1c-kube-api-access-kkv6r\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.244728 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3203c694-b4b0-43d8-a728-8a0804346e1c-config-volume\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.244753 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3203c694-b4b0-43d8-a728-8a0804346e1c-secret-volume\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.345377 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkv6r\" (UniqueName: \"kubernetes.io/projected/3203c694-b4b0-43d8-a728-8a0804346e1c-kube-api-access-kkv6r\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.345456 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3203c694-b4b0-43d8-a728-8a0804346e1c-config-volume\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" Dec 10 12:00:00 crc 
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.345488 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3203c694-b4b0-43d8-a728-8a0804346e1c-secret-volume\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.346823 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3203c694-b4b0-43d8-a728-8a0804346e1c-config-volume\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.355846 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3203c694-b4b0-43d8-a728-8a0804346e1c-secret-volume\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.366969 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkv6r\" (UniqueName: \"kubernetes.io/projected/3203c694-b4b0-43d8-a728-8a0804346e1c-kube-api-access-kkv6r\") pod \"collect-profiles-29422800-w87qd\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.502584 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.721008 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"]
Dec 10 12:00:00 crc kubenswrapper[4852]: I1210 12:00:00.986367 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" event={"ID":"3203c694-b4b0-43d8-a728-8a0804346e1c","Type":"ContainerStarted","Data":"bd0cd5b7d8361d317fec5cd1d1a93108511174bce8050947a9705ba8f2a9f8cd"}
Dec 10 12:00:01 crc kubenswrapper[4852]: I1210 12:00:01.995175 4852 generic.go:334] "Generic (PLEG): container finished" podID="3203c694-b4b0-43d8-a728-8a0804346e1c" containerID="2f4e481241117b39520e59cda5b229c4110bab117172284f52ea06f0aadd3a7c" exitCode=0
Dec 10 12:00:01 crc kubenswrapper[4852]: I1210 12:00:01.995556 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" event={"ID":"3203c694-b4b0-43d8-a728-8a0804346e1c","Type":"ContainerDied","Data":"2f4e481241117b39520e59cda5b229c4110bab117172284f52ea06f0aadd3a7c"}
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.221599 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.390517 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3203c694-b4b0-43d8-a728-8a0804346e1c-secret-volume\") pod \"3203c694-b4b0-43d8-a728-8a0804346e1c\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") "
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.390653 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3203c694-b4b0-43d8-a728-8a0804346e1c-config-volume\") pod \"3203c694-b4b0-43d8-a728-8a0804346e1c\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") "
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.390716 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkv6r\" (UniqueName: \"kubernetes.io/projected/3203c694-b4b0-43d8-a728-8a0804346e1c-kube-api-access-kkv6r\") pod \"3203c694-b4b0-43d8-a728-8a0804346e1c\" (UID: \"3203c694-b4b0-43d8-a728-8a0804346e1c\") "
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.391478 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3203c694-b4b0-43d8-a728-8a0804346e1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "3203c694-b4b0-43d8-a728-8a0804346e1c" (UID: "3203c694-b4b0-43d8-a728-8a0804346e1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.396501 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3203c694-b4b0-43d8-a728-8a0804346e1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3203c694-b4b0-43d8-a728-8a0804346e1c" (UID: "3203c694-b4b0-43d8-a728-8a0804346e1c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.397416 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3203c694-b4b0-43d8-a728-8a0804346e1c-kube-api-access-kkv6r" (OuterVolumeSpecName: "kube-api-access-kkv6r") pod "3203c694-b4b0-43d8-a728-8a0804346e1c" (UID: "3203c694-b4b0-43d8-a728-8a0804346e1c"). InnerVolumeSpecName "kube-api-access-kkv6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.492852 4852 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3203c694-b4b0-43d8-a728-8a0804346e1c-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.492928 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3203c694-b4b0-43d8-a728-8a0804346e1c-config-volume\") on node \"crc\" DevicePath \"\""
Dec 10 12:00:03 crc kubenswrapper[4852]: I1210 12:00:03.492942 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkv6r\" (UniqueName: \"kubernetes.io/projected/3203c694-b4b0-43d8-a728-8a0804346e1c-kube-api-access-kkv6r\") on node \"crc\" DevicePath \"\""
Dec 10 12:00:04 crc kubenswrapper[4852]: I1210 12:00:04.012513 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd" event={"ID":"3203c694-b4b0-43d8-a728-8a0804346e1c","Type":"ContainerDied","Data":"bd0cd5b7d8361d317fec5cd1d1a93108511174bce8050947a9705ba8f2a9f8cd"}
Dec 10 12:00:04 crc kubenswrapper[4852]: I1210 12:00:04.012585 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0cd5b7d8361d317fec5cd1d1a93108511174bce8050947a9705ba8f2a9f8cd"
Dec 10 12:00:04 crc kubenswrapper[4852]: I1210 12:00:04.012690 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"
Dec 10 12:01:45 crc kubenswrapper[4852]: I1210 12:01:45.791244 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:01:45 crc kubenswrapper[4852]: I1210 12:01:45.794149 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:02:15 crc kubenswrapper[4852]: I1210 12:02:15.790359 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:02:15 crc kubenswrapper[4852]: I1210 12:02:15.790867 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.790177 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.790899 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.790944 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh"
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.791423 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f347330cd2d5cdf86ff9446a444fd62f87cac15b734887bf743393441452d4f"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.791467 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://5f347330cd2d5cdf86ff9446a444fd62f87cac15b734887bf743393441452d4f" gracePeriod=600
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.946352 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="5f347330cd2d5cdf86ff9446a444fd62f87cac15b734887bf743393441452d4f" exitCode=0
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.946399 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"5f347330cd2d5cdf86ff9446a444fd62f87cac15b734887bf743393441452d4f"}
Dec 10 12:02:45 crc kubenswrapper[4852]: I1210 12:02:45.946435 4852 scope.go:117] "RemoveContainer" containerID="81155a692a32264fc4c153e9736724fd23b0a15aa61c032f0b091089d2a44202"
Dec 10 12:02:46 crc kubenswrapper[4852]: I1210 12:02:46.953664 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"77c211572bcee4c8a77c07da48869683ba7551ebec91c3aa4c5542663748ddba"}
Dec 10 12:03:12 crc kubenswrapper[4852]: I1210 12:03:12.969178 4852 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.594327 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-z86h2"]
Dec 10 12:04:30 crc kubenswrapper[4852]: E1210 12:04:30.595569 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3203c694-b4b0-43d8-a728-8a0804346e1c" containerName="collect-profiles"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.595587 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3203c694-b4b0-43d8-a728-8a0804346e1c" containerName="collect-profiles"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.595731 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3203c694-b4b0-43d8-a728-8a0804346e1c" containerName="collect-profiles"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.596336 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.601485 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.601851 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.602278 4852 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bqmbv"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.607459 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nt4zl"]
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.608106 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nt4zl"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.613476 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-z86h2"]
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.616884 4852 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-p2q2p"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.626298 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"]
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.627311 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.632096 4852 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lglbr"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.634856 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nt4zl"]
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.644108 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"]
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.783387 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2w4g\" (UniqueName: \"kubernetes.io/projected/bb5429e3-7f2e-4632-b68b-18de65b5e060-kube-api-access-g2w4g\") pod \"cert-manager-5b446d88c5-nt4zl\" (UID: \"bb5429e3-7f2e-4632-b68b-18de65b5e060\") " pod="cert-manager/cert-manager-5b446d88c5-nt4zl"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.783468 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nxl2\" (UniqueName: \"kubernetes.io/projected/dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4-kube-api-access-2nxl2\") pod \"cert-manager-cainjector-7f985d654d-z86h2\" (UID: \"dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.783497 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjls6\" (UniqueName: \"kubernetes.io/projected/78c79d4e-6293-4789-932e-2c42545750a5-kube-api-access-qjls6\") pod \"cert-manager-webhook-5655c58dd6-rj8fh\" (UID: \"78c79d4e-6293-4789-932e-2c42545750a5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.884152 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2w4g\" (UniqueName: \"kubernetes.io/projected/bb5429e3-7f2e-4632-b68b-18de65b5e060-kube-api-access-g2w4g\") pod \"cert-manager-5b446d88c5-nt4zl\" (UID: \"bb5429e3-7f2e-4632-b68b-18de65b5e060\") " pod="cert-manager/cert-manager-5b446d88c5-nt4zl"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.884364 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nxl2\" (UniqueName: \"kubernetes.io/projected/dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4-kube-api-access-2nxl2\") pod \"cert-manager-cainjector-7f985d654d-z86h2\" (UID: \"dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.884396 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjls6\" (UniqueName: \"kubernetes.io/projected/78c79d4e-6293-4789-932e-2c42545750a5-kube-api-access-qjls6\") pod \"cert-manager-webhook-5655c58dd6-rj8fh\" (UID: \"78c79d4e-6293-4789-932e-2c42545750a5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.908935 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nxl2\" (UniqueName: \"kubernetes.io/projected/dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4-kube-api-access-2nxl2\") pod \"cert-manager-cainjector-7f985d654d-z86h2\" (UID: \"dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.909057 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjls6\" (UniqueName: \"kubernetes.io/projected/78c79d4e-6293-4789-932e-2c42545750a5-kube-api-access-qjls6\") pod \"cert-manager-webhook-5655c58dd6-rj8fh\" (UID: \"78c79d4e-6293-4789-932e-2c42545750a5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.911764 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2w4g\" (UniqueName: \"kubernetes.io/projected/bb5429e3-7f2e-4632-b68b-18de65b5e060-kube-api-access-g2w4g\") pod \"cert-manager-5b446d88c5-nt4zl\" (UID: \"bb5429e3-7f2e-4632-b68b-18de65b5e060\") " pod="cert-manager/cert-manager-5b446d88c5-nt4zl"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.932661 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.944575 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nt4zl"
Dec 10 12:04:30 crc kubenswrapper[4852]: I1210 12:04:30.955263 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.197487 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nt4zl"]
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.207094 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.247165 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-z86h2"]
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.299624 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rj8fh"]
Dec 10 12:04:31 crc kubenswrapper[4852]: W1210 12:04:31.303678 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c79d4e_6293_4789_932e_2c42545750a5.slice/crio-5aec94c04f820142e33d467e46d2e66ea1d0a8a8245ddb665721a1398148509d WatchSource:0}: Error finding container 5aec94c04f820142e33d467e46d2e66ea1d0a8a8245ddb665721a1398148509d: Status 404 returned error can't find the container with id 5aec94c04f820142e33d467e46d2e66ea1d0a8a8245ddb665721a1398148509d
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.498182 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2" event={"ID":"dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4","Type":"ContainerStarted","Data":"907ee30153103e0266a0a0790da69bab662c1454b4eb2799eab65d58c760166d"}
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.501380 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh" event={"ID":"78c79d4e-6293-4789-932e-2c42545750a5","Type":"ContainerStarted","Data":"5aec94c04f820142e33d467e46d2e66ea1d0a8a8245ddb665721a1398148509d"}
Dec 10 12:04:31 crc kubenswrapper[4852]: I1210 12:04:31.502520 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nt4zl" event={"ID":"bb5429e3-7f2e-4632-b68b-18de65b5e060","Type":"ContainerStarted","Data":"358618a23d03142910d5237feb74fcbdf4d01c7a0ca62e2f33d561a6654693f6"}
Dec 10 12:04:34 crc kubenswrapper[4852]: I1210 12:04:34.521671 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nt4zl" event={"ID":"bb5429e3-7f2e-4632-b68b-18de65b5e060","Type":"ContainerStarted","Data":"3e8ba26f65003d2d40c303720a1940973d7e735d55ebc21e697ab68f25540ef9"}
Dec 10 12:04:34 crc kubenswrapper[4852]: I1210 12:04:34.535732 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-nt4zl" podStartSLOduration=1.758935739 podStartE2EDuration="4.535680135s" podCreationTimestamp="2025-12-10 12:04:30 +0000 UTC" firstStartedPulling="2025-12-10 12:04:31.206813516 +0000 UTC m=+757.292338740" lastFinishedPulling="2025-12-10 12:04:33.983557922 +0000 UTC m=+760.069083136" observedRunningTime="2025-12-10 12:04:34.534160907 +0000 UTC m=+760.619686131" watchObservedRunningTime="2025-12-10 12:04:34.535680135 +0000 UTC m=+760.621205359"
event={"ID":"dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4","Type":"ContainerStarted","Data":"942934ba7135301f36e9b7ae152df64e31ae8010003f85127921ea03347bb3d9"} Dec 10 12:04:36 crc kubenswrapper[4852]: I1210 12:04:36.539418 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh" event={"ID":"78c79d4e-6293-4789-932e-2c42545750a5","Type":"ContainerStarted","Data":"b24cd9e405c26829d1a170ab0bbe4c6381443e65a771aded212df58a8f8d5d38"} Dec 10 12:04:36 crc kubenswrapper[4852]: I1210 12:04:36.539576 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh" Dec 10 12:04:36 crc kubenswrapper[4852]: I1210 12:04:36.563627 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-z86h2" podStartSLOduration=2.324935209 podStartE2EDuration="6.563589488s" podCreationTimestamp="2025-12-10 12:04:30 +0000 UTC" firstStartedPulling="2025-12-10 12:04:31.255832605 +0000 UTC m=+757.341357829" lastFinishedPulling="2025-12-10 12:04:35.494486884 +0000 UTC m=+761.580012108" observedRunningTime="2025-12-10 12:04:36.558263544 +0000 UTC m=+762.643788788" watchObservedRunningTime="2025-12-10 12:04:36.563589488 +0000 UTC m=+762.649114722" Dec 10 12:04:36 crc kubenswrapper[4852]: I1210 12:04:36.585078 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh" podStartSLOduration=2.322233282 podStartE2EDuration="6.585044576s" podCreationTimestamp="2025-12-10 12:04:30 +0000 UTC" firstStartedPulling="2025-12-10 12:04:31.305655864 +0000 UTC m=+757.391181088" lastFinishedPulling="2025-12-10 12:04:35.568467158 +0000 UTC m=+761.653992382" observedRunningTime="2025-12-10 12:04:36.572580053 +0000 UTC m=+762.658105277" watchObservedRunningTime="2025-12-10 12:04:36.585044576 +0000 UTC m=+762.670569810" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.239263 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-89m87"] Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240417 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-controller" containerID="cri-o://0e72f2311dc0b72b226ef1f9b878c8cdd81b928d1157cad951ad4bbe6117f79f" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240555 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-acl-logging" containerID="cri-o://e2e4371fec6226d65683a3a8c8b2f400e69e6f0881a7b801f3c06d62256332df" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240541 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="nbdb" containerID="cri-o://7628905c2e633e97f2fe958e4211f5b02890b8fe5105b17966eb92658c9b9012" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240598 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-node" 
containerID="cri-o://986c5d5945116451836d213d41a0bd862b4c52bfa735a9d983424a109980a3c4" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240627 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="northd" containerID="cri-o://6901027836a6617e3b89c30b0d07101b2e21384249f3b74c9f39334e3aeab0e6" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240665 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a1028602c7fa4ce561b0d24d016dd49368a7bc7a4d7f6c4cd21254651b84eb77" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.240653 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="sbdb" containerID="cri-o://56bc33d68cf0b28a9bda8ec77881ce204966237302fdb28f2d71601eccd79938" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.282651 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovnkube-controller" containerID="cri-o://82f8dd56768dbe144d4e28db2d0938df758f701faadc338df75e9fb7c3a13f43" gracePeriod=30 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.565430 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzcx9_7d01ef2d-58af-42c3-b716-9020614e2a09/kube-multus/0.log" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.565482 4852 generic.go:334] "Generic (PLEG): container finished" podID="7d01ef2d-58af-42c3-b716-9020614e2a09" containerID="181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e" exitCode=2 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.565535 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzcx9" event={"ID":"7d01ef2d-58af-42c3-b716-9020614e2a09","Type":"ContainerDied","Data":"181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.565958 4852 scope.go:117] "RemoveContainer" containerID="181b8ac1ab6e03b4b0d6a5b4b56de675b0bf62b450ded27a34cacc5edfa7a57e" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.578359 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-89m87_64c17726-4529-4a16-9d1e-e7e40fa6055a/ovn-acl-logging/0.log" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.582252 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-89m87_64c17726-4529-4a16-9d1e-e7e40fa6055a/ovn-controller/0.log" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588426 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="82f8dd56768dbe144d4e28db2d0938df758f701faadc338df75e9fb7c3a13f43" exitCode=0 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588466 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="56bc33d68cf0b28a9bda8ec77881ce204966237302fdb28f2d71601eccd79938" exitCode=0 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 
12:04:41.588475 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="7628905c2e633e97f2fe958e4211f5b02890b8fe5105b17966eb92658c9b9012" exitCode=0 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588483 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="6901027836a6617e3b89c30b0d07101b2e21384249f3b74c9f39334e3aeab0e6" exitCode=0 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588492 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="a1028602c7fa4ce561b0d24d016dd49368a7bc7a4d7f6c4cd21254651b84eb77" exitCode=0 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588499 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="986c5d5945116451836d213d41a0bd862b4c52bfa735a9d983424a109980a3c4" exitCode=0 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588507 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="e2e4371fec6226d65683a3a8c8b2f400e69e6f0881a7b801f3c06d62256332df" exitCode=143 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588517 4852 generic.go:334] "Generic (PLEG): container finished" podID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerID="0e72f2311dc0b72b226ef1f9b878c8cdd81b928d1157cad951ad4bbe6117f79f" exitCode=143 Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588548 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"82f8dd56768dbe144d4e28db2d0938df758f701faadc338df75e9fb7c3a13f43"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588589 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"56bc33d68cf0b28a9bda8ec77881ce204966237302fdb28f2d71601eccd79938"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588601 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"7628905c2e633e97f2fe958e4211f5b02890b8fe5105b17966eb92658c9b9012"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588611 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"6901027836a6617e3b89c30b0d07101b2e21384249f3b74c9f39334e3aeab0e6"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588622 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"a1028602c7fa4ce561b0d24d016dd49368a7bc7a4d7f6c4cd21254651b84eb77"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588631 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"986c5d5945116451836d213d41a0bd862b4c52bfa735a9d983424a109980a3c4"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588639 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"e2e4371fec6226d65683a3a8c8b2f400e69e6f0881a7b801f3c06d62256332df"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.588648 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"0e72f2311dc0b72b226ef1f9b878c8cdd81b928d1157cad951ad4bbe6117f79f"} Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.623452 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-89m87_64c17726-4529-4a16-9d1e-e7e40fa6055a/ovn-acl-logging/0.log" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.623967 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-89m87_64c17726-4529-4a16-9d1e-e7e40fa6055a/ovn-controller/0.log" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.624556 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.702514 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d28qs"] Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703206 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="northd" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703226 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="northd" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703257 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kubecfg-setup" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703265 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kubecfg-setup" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703281 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-acl-logging" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703289 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-acl-logging" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703300 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-ovn-metrics" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703309 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-ovn-metrics" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703318 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-node" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703326 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-node" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703340 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-controller" Dec 10 12:04:41 
crc kubenswrapper[4852]: I1210 12:04:41.703348 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-controller" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703363 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="nbdb" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703372 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="nbdb" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703382 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovnkube-controller" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703390 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovnkube-controller" Dec 10 12:04:41 crc kubenswrapper[4852]: E1210 12:04:41.703401 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="sbdb" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703408 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="sbdb" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703529 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="nbdb" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703547 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-controller" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703558 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovn-acl-logging" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703570 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="northd" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703581 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-ovn-metrics" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703589 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="ovnkube-controller" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703603 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="kube-rbac-proxy-node" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.703613 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" containerName="sbdb" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.705919 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738680 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-netd\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738745 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-config\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738773 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-var-lib-openvswitch\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738792 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-slash\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738817 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qgtq\" (UniqueName: \"kubernetes.io/projected/64c17726-4529-4a16-9d1e-e7e40fa6055a-kube-api-access-9qgtq\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738833 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-systemd-units\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738846 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-systemd\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738865 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-kubelet\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738882 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-openvswitch\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738921 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-ovn\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" 
(UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738936 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-node-log\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738954 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-ovn-kubernetes\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738973 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-bin\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.738988 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-etc-openvswitch\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739004 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovn-node-metrics-cert\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739033 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-script-lib\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739046 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-netns\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739066 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-log-socket\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739086 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-env-overrides\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739113 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"64c17726-4529-4a16-9d1e-e7e40fa6055a\" (UID: \"64c17726-4529-4a16-9d1e-e7e40fa6055a\") " Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739736 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-node-log" (OuterVolumeSpecName: "node-log") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739830 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.739839 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740066 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740164 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740196 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740373 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740515 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-log-socket" (OuterVolumeSpecName: "log-socket") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740537 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740562 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-slash" (OuterVolumeSpecName: "host-slash") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740843 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740873 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740912 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740962 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.740991 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.741460 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.741501 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.746988 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c17726-4529-4a16-9d1e-e7e40fa6055a-kube-api-access-9qgtq" (OuterVolumeSpecName: "kube-api-access-9qgtq") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "kube-api-access-9qgtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.747533 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.758947 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "64c17726-4529-4a16-9d1e-e7e40fa6055a" (UID: "64c17726-4529-4a16-9d1e-e7e40fa6055a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.840856 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-var-lib-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.841359 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.841439 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-cni-bin\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.841524 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f27876cc-b48e-4170-a167-8b4a3e393512-ovn-node-metrics-cert\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.841600 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842141 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-cni-netd\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842485 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhvs4\" (UniqueName: \"kubernetes.io/projected/f27876cc-b48e-4170-a167-8b4a3e393512-kube-api-access-hhvs4\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842608 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-env-overrides\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842693 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-kubelet\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842759 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-systemd\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842851 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-run-ovn-kubernetes\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.842970 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-slash\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843068 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-ovnkube-config\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843211 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-etc-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843291 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-node-log\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843313 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-ovnkube-script-lib\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843337 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-log-socket\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843395 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-ovn\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843434 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-systemd-units\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843471 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-run-netns\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843562 4852 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843580 4852 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843593 4852 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843610 4852 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843627 4852 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843643 4852 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-log-socket\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843657 4852 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843671 4852 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843686 4852 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-cni-netd\") on node \"crc\" 
DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843702 4852 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64c17726-4529-4a16-9d1e-e7e40fa6055a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843719 4852 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843731 4852 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-slash\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843742 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qgtq\" (UniqueName: \"kubernetes.io/projected/64c17726-4529-4a16-9d1e-e7e40fa6055a-kube-api-access-9qgtq\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843755 4852 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843766 4852 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843776 4852 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843785 4852 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843794 4852 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843805 4852 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-node-log\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.843815 4852 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64c17726-4529-4a16-9d1e-e7e40fa6055a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.944854 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-cni-netd\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.944946 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhvs4\" (UniqueName: 
\"kubernetes.io/projected/f27876cc-b48e-4170-a167-8b4a3e393512-kube-api-access-hhvs4\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.944979 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-env-overrides\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945001 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-cni-netd\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945039 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-kubelet\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945006 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-kubelet\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945079 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-systemd\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945106 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-run-ovn-kubernetes\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945138 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-slash\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945164 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-ovnkube-config\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945189 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-etc-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: 
\"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945215 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-node-log\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945256 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-ovnkube-script-lib\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945279 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-log-socket\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945309 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-ovn\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945316 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-run-ovn-kubernetes\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945365 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-systemd-units\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945339 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-systemd-units\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945381 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-systemd\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945463 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-ovn\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945468 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-etc-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945399 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-slash\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945412 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-run-netns\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945439 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-log-socket\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945513 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-node-log\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945437 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-run-netns\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945533 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-var-lib-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945563 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945591 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-cni-bin\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945623 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/f27876cc-b48e-4170-a167-8b4a3e393512-ovn-node-metrics-cert\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945635 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945625 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-var-lib-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945722 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945649 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-run-openvswitch\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945772 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f27876cc-b48e-4170-a167-8b4a3e393512-host-cni-bin\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.945976 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-env-overrides\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.946214 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-ovnkube-script-lib\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.946171 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f27876cc-b48e-4170-a167-8b4a3e393512-ovnkube-config\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.950717 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f27876cc-b48e-4170-a167-8b4a3e393512-ovn-node-metrics-cert\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:41 crc kubenswrapper[4852]: I1210 12:04:41.961428 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhvs4\" (UniqueName: \"kubernetes.io/projected/f27876cc-b48e-4170-a167-8b4a3e393512-kube-api-access-hhvs4\") pod \"ovnkube-node-d28qs\" (UID: \"f27876cc-b48e-4170-a167-8b4a3e393512\") " pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.025391 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.597774 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mzcx9_7d01ef2d-58af-42c3-b716-9020614e2a09/kube-multus/0.log" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.598327 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mzcx9" event={"ID":"7d01ef2d-58af-42c3-b716-9020614e2a09","Type":"ContainerStarted","Data":"abf652e2e83a7380ab22ae2c61855ad64abb59c4bcf17850bdf6926be70803a4"} Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.600924 4852 generic.go:334] "Generic (PLEG): container finished" podID="f27876cc-b48e-4170-a167-8b4a3e393512" containerID="0dd5f04c994a6550290e0ed385aceb4b69a5352b7dedd258bc23b85f631bb329" exitCode=0 Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.601010 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerDied","Data":"0dd5f04c994a6550290e0ed385aceb4b69a5352b7dedd258bc23b85f631bb329"} Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.601080 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"fd74cccea6200882054995faf98500f4d6b5c8b344ea4cadcb5b95b6e73cfb49"} Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.606994 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-89m87_64c17726-4529-4a16-9d1e-e7e40fa6055a/ovn-acl-logging/0.log" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.608064 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-89m87_64c17726-4529-4a16-9d1e-e7e40fa6055a/ovn-controller/0.log" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.608690 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" event={"ID":"64c17726-4529-4a16-9d1e-e7e40fa6055a","Type":"ContainerDied","Data":"483383519e1fc6e0525f78e02db44a8c3028228410073467d5ced2664f093db6"} Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.608745 4852 scope.go:117] "RemoveContainer" containerID="82f8dd56768dbe144d4e28db2d0938df758f701faadc338df75e9fb7c3a13f43" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.608916 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-89m87" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.634940 4852 scope.go:117] "RemoveContainer" containerID="56bc33d68cf0b28a9bda8ec77881ce204966237302fdb28f2d71601eccd79938" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.681122 4852 scope.go:117] "RemoveContainer" containerID="7628905c2e633e97f2fe958e4211f5b02890b8fe5105b17966eb92658c9b9012" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.700716 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-89m87"] Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.709859 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-89m87"] Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.718638 4852 scope.go:117] "RemoveContainer" containerID="6901027836a6617e3b89c30b0d07101b2e21384249f3b74c9f39334e3aeab0e6" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.740726 4852 scope.go:117] "RemoveContainer" containerID="a1028602c7fa4ce561b0d24d016dd49368a7bc7a4d7f6c4cd21254651b84eb77" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.766162 4852 scope.go:117] "RemoveContainer" containerID="986c5d5945116451836d213d41a0bd862b4c52bfa735a9d983424a109980a3c4" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.783063 4852 scope.go:117] "RemoveContainer" containerID="e2e4371fec6226d65683a3a8c8b2f400e69e6f0881a7b801f3c06d62256332df" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.799766 4852 scope.go:117] "RemoveContainer" containerID="0e72f2311dc0b72b226ef1f9b878c8cdd81b928d1157cad951ad4bbe6117f79f" Dec 10 12:04:42 crc kubenswrapper[4852]: I1210 12:04:42.823212 4852 scope.go:117] "RemoveContainer" containerID="e5d7d4620ee981f00c1e19c2f2c5b22985fd554dfb2b8c689401b1fefc33ce13" Dec 10 12:04:43 crc kubenswrapper[4852]: I1210 12:04:43.621355 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"63f97f939b2485ca20787601c5c30f5a3d54a1e2fbcefa9afa9be26af0bc90af"} Dec 10 12:04:43 crc kubenswrapper[4852]: I1210 12:04:43.622007 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"95745a0e8aef6e7e2bd440275057d15d230442798fee84952c282e0146945a18"} Dec 10 12:04:43 crc kubenswrapper[4852]: I1210 12:04:43.622021 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"fabad8c98c36f1dbdaf2dce16555fae7ac11ed0f46d8f831511abd9cfd4f3306"} Dec 10 12:04:43 crc kubenswrapper[4852]: I1210 12:04:43.622030 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"232902d5d85319b787f4a5d37f4b482e7dd8c9b35504e92423fabcd21a6fb913"} Dec 10 12:04:43 crc kubenswrapper[4852]: I1210 12:04:43.622040 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"da6ab6e6d59b48fc3649336aec698d0d2cfeffaec993b20537d2b0a28d9ce5bb"} Dec 10 12:04:43 crc kubenswrapper[4852]: I1210 12:04:43.622051 4852 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"b74db132b05de6f09ce83c567ab0f6c0d31c03ddc31eeb6d2a80b8454c2cf51a"} Dec 10 12:04:44 crc kubenswrapper[4852]: I1210 12:04:44.193814 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c17726-4529-4a16-9d1e-e7e40fa6055a" path="/var/lib/kubelet/pods/64c17726-4529-4a16-9d1e-e7e40fa6055a/volumes" Dec 10 12:04:45 crc kubenswrapper[4852]: I1210 12:04:45.958213 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-rj8fh" Dec 10 12:04:46 crc kubenswrapper[4852]: I1210 12:04:46.644509 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"bda72b44925c56cd259ff9ef1f763996876047db5ef476ed17114ce5e5a0fb66"} Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.679609 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" event={"ID":"f27876cc-b48e-4170-a167-8b4a3e393512","Type":"ContainerStarted","Data":"f10a869368240e6549a22504c45fc9d2df0afee38cc29d572e4e23ffad46a25c"} Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.680308 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.680322 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.680334 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.716143 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.719568 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:04:51 crc kubenswrapper[4852]: I1210 12:04:51.720215 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" podStartSLOduration=10.720191004 podStartE2EDuration="10.720191004s" podCreationTimestamp="2025-12-10 12:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:04:51.715731262 +0000 UTC m=+777.801256506" watchObservedRunningTime="2025-12-10 12:04:51.720191004 +0000 UTC m=+777.805716228" Dec 10 12:05:12 crc kubenswrapper[4852]: I1210 12:05:12.050535 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d28qs" Dec 10 12:05:15 crc kubenswrapper[4852]: I1210 12:05:15.789875 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:05:15 crc kubenswrapper[4852]: I1210 12:05:15.790330 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" 
podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.342869 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr"] Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.345354 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.348818 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.357897 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr"] Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.504721 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.504790 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554cc\" (UniqueName: \"kubernetes.io/projected/d6c5d4b5-9826-4365-944e-097108097f70-kube-api-access-554cc\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.504974 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.606758 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.606880 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.606906 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-554cc\" (UniqueName: \"kubernetes.io/projected/d6c5d4b5-9826-4365-944e-097108097f70-kube-api-access-554cc\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.607458 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.607479 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.629844 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-554cc\" (UniqueName: \"kubernetes.io/projected/d6c5d4b5-9826-4365-944e-097108097f70-kube-api-access-554cc\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.667048 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.854862 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr"] Dec 10 12:05:31 crc kubenswrapper[4852]: I1210 12:05:31.905452 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" event={"ID":"d6c5d4b5-9826-4365-944e-097108097f70","Type":"ContainerStarted","Data":"995c171aa5f595b60ec854df4eee23d0a6bfa84d56a19b3c2bced92d6b8943a6"} Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.481941 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n6xhs"] Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.483392 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.493884 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xhs"] Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.620609 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-utilities\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.620984 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-catalog-content\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.621130 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbgxt\" (UniqueName: \"kubernetes.io/projected/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-kube-api-access-rbgxt\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.723038 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-utilities\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.723103 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-catalog-content\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.723137 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbgxt\" (UniqueName: \"kubernetes.io/projected/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-kube-api-access-rbgxt\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.723737 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-catalog-content\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.723853 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-utilities\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.741604 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rbgxt\" (UniqueName: \"kubernetes.io/projected/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-kube-api-access-rbgxt\") pod \"redhat-operators-n6xhs\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.808365 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.917577 4852 generic.go:334] "Generic (PLEG): container finished" podID="d6c5d4b5-9826-4365-944e-097108097f70" containerID="a94879a582e613e20966759363856db39e7cba19f18ee7dbbfade7c8c7fc8166" exitCode=0 Dec 10 12:05:32 crc kubenswrapper[4852]: I1210 12:05:32.917710 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" event={"ID":"d6c5d4b5-9826-4365-944e-097108097f70","Type":"ContainerDied","Data":"a94879a582e613e20966759363856db39e7cba19f18ee7dbbfade7c8c7fc8166"} Dec 10 12:05:33 crc kubenswrapper[4852]: I1210 12:05:33.008722 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n6xhs"] Dec 10 12:05:33 crc kubenswrapper[4852]: W1210 12:05:33.019524 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf30b0382_24bf_4e69_9ad7_1071ebf1f2b5.slice/crio-f17b36a1c372ffce6e0e30a62a8f48ca7443c802b83338ff9d7c8bd1fdda988e WatchSource:0}: Error finding container f17b36a1c372ffce6e0e30a62a8f48ca7443c802b83338ff9d7c8bd1fdda988e: Status 404 returned error can't find the container with id f17b36a1c372ffce6e0e30a62a8f48ca7443c802b83338ff9d7c8bd1fdda988e Dec 10 12:05:33 crc kubenswrapper[4852]: I1210 12:05:33.926437 4852 generic.go:334] "Generic (PLEG): container finished" podID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerID="b3f80bd2b9ae30252331fff00747fbf158796ed1863cb8e6f83cf500b674b36c" exitCode=0 Dec 10 12:05:33 crc kubenswrapper[4852]: I1210 12:05:33.926550 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerDied","Data":"b3f80bd2b9ae30252331fff00747fbf158796ed1863cb8e6f83cf500b674b36c"} Dec 10 12:05:33 crc kubenswrapper[4852]: I1210 12:05:33.926895 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerStarted","Data":"f17b36a1c372ffce6e0e30a62a8f48ca7443c802b83338ff9d7c8bd1fdda988e"} Dec 10 12:05:35 crc kubenswrapper[4852]: I1210 12:05:35.953694 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerStarted","Data":"f9156500ce82d73aa7cd53d7f742539c96565f692397d3e3f6d743c30dbfe8ce"} Dec 10 12:05:36 crc kubenswrapper[4852]: I1210 12:05:36.966537 4852 generic.go:334] "Generic (PLEG): container finished" podID="d6c5d4b5-9826-4365-944e-097108097f70" containerID="e4f862464512d35d9a73859e5d37d77a4aa4e8827b5edcaf48ff4a06ee39e124" exitCode=0 Dec 10 12:05:36 crc kubenswrapper[4852]: I1210 12:05:36.966597 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" 
event={"ID":"d6c5d4b5-9826-4365-944e-097108097f70","Type":"ContainerDied","Data":"e4f862464512d35d9a73859e5d37d77a4aa4e8827b5edcaf48ff4a06ee39e124"} Dec 10 12:05:36 crc kubenswrapper[4852]: I1210 12:05:36.969209 4852 generic.go:334] "Generic (PLEG): container finished" podID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerID="f9156500ce82d73aa7cd53d7f742539c96565f692397d3e3f6d743c30dbfe8ce" exitCode=0 Dec 10 12:05:36 crc kubenswrapper[4852]: I1210 12:05:36.969297 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerDied","Data":"f9156500ce82d73aa7cd53d7f742539c96565f692397d3e3f6d743c30dbfe8ce"} Dec 10 12:05:37 crc kubenswrapper[4852]: I1210 12:05:37.978264 4852 generic.go:334] "Generic (PLEG): container finished" podID="d6c5d4b5-9826-4365-944e-097108097f70" containerID="b3521ea1995ac704f56dce6b7678f41bab3721697ee91ab8bd7a0e2982017ed7" exitCode=0 Dec 10 12:05:37 crc kubenswrapper[4852]: I1210 12:05:37.978518 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" event={"ID":"d6c5d4b5-9826-4365-944e-097108097f70","Type":"ContainerDied","Data":"b3521ea1995ac704f56dce6b7678f41bab3721697ee91ab8bd7a0e2982017ed7"} Dec 10 12:05:38 crc kubenswrapper[4852]: I1210 12:05:38.985434 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerStarted","Data":"2a48fc21ae7d48d789a1223fb44645a44dfe63a295a73452e4980c7a3966194b"} Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.005922 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n6xhs" podStartSLOduration=2.6520872559999997 podStartE2EDuration="7.005905008s" podCreationTimestamp="2025-12-10 12:05:32 +0000 UTC" firstStartedPulling="2025-12-10 12:05:33.92952437 +0000 UTC m=+820.015049594" lastFinishedPulling="2025-12-10 12:05:38.283342122 +0000 UTC m=+824.368867346" observedRunningTime="2025-12-10 12:05:39.003614171 +0000 UTC m=+825.089139395" watchObservedRunningTime="2025-12-10 12:05:39.005905008 +0000 UTC m=+825.091430242" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.292202 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.408947 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-bundle\") pod \"d6c5d4b5-9826-4365-944e-097108097f70\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.409056 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-554cc\" (UniqueName: \"kubernetes.io/projected/d6c5d4b5-9826-4365-944e-097108097f70-kube-api-access-554cc\") pod \"d6c5d4b5-9826-4365-944e-097108097f70\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.409079 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-util\") pod \"d6c5d4b5-9826-4365-944e-097108097f70\" (UID: \"d6c5d4b5-9826-4365-944e-097108097f70\") " Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.409860 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-bundle" (OuterVolumeSpecName: "bundle") pod "d6c5d4b5-9826-4365-944e-097108097f70" (UID: "d6c5d4b5-9826-4365-944e-097108097f70"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.414567 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c5d4b5-9826-4365-944e-097108097f70-kube-api-access-554cc" (OuterVolumeSpecName: "kube-api-access-554cc") pod "d6c5d4b5-9826-4365-944e-097108097f70" (UID: "d6c5d4b5-9826-4365-944e-097108097f70"). InnerVolumeSpecName "kube-api-access-554cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.428314 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-util" (OuterVolumeSpecName: "util") pod "d6c5d4b5-9826-4365-944e-097108097f70" (UID: "d6c5d4b5-9826-4365-944e-097108097f70"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.511295 4852 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.511367 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-554cc\" (UniqueName: \"kubernetes.io/projected/d6c5d4b5-9826-4365-944e-097108097f70-kube-api-access-554cc\") on node \"crc\" DevicePath \"\"" Dec 10 12:05:39 crc kubenswrapper[4852]: I1210 12:05:39.511387 4852 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6c5d4b5-9826-4365-944e-097108097f70-util\") on node \"crc\" DevicePath \"\"" Dec 10 12:05:40 crc kubenswrapper[4852]: I1210 12:05:40.006486 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" Dec 10 12:05:40 crc kubenswrapper[4852]: I1210 12:05:40.006577 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr" event={"ID":"d6c5d4b5-9826-4365-944e-097108097f70","Type":"ContainerDied","Data":"995c171aa5f595b60ec854df4eee23d0a6bfa84d56a19b3c2bced92d6b8943a6"} Dec 10 12:05:40 crc kubenswrapper[4852]: I1210 12:05:40.006607 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995c171aa5f595b60ec854df4eee23d0a6bfa84d56a19b3c2bced92d6b8943a6" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.738211 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr"] Dec 10 12:05:42 crc kubenswrapper[4852]: E1210 12:05:42.738807 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="util" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.738822 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="util" Dec 10 12:05:42 crc kubenswrapper[4852]: E1210 12:05:42.738843 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="extract" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.738851 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="extract" Dec 10 12:05:42 crc kubenswrapper[4852]: E1210 12:05:42.738863 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="pull" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.738871 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="pull" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.738980 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c5d4b5-9826-4365-944e-097108097f70" containerName="extract" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.739387 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.741397 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qxnvs" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.741553 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.744634 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.751851 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr"] Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.808508 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.808550 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.871521 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9l4\" (UniqueName: \"kubernetes.io/projected/35feaa98-be47-42f8-af3b-bf8a5ef57ce4-kube-api-access-kx9l4\") pod \"nmstate-operator-5b5b58f5c8-s8bfr\" (UID: \"35feaa98-be47-42f8-af3b-bf8a5ef57ce4\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.973298 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9l4\" (UniqueName: \"kubernetes.io/projected/35feaa98-be47-42f8-af3b-bf8a5ef57ce4-kube-api-access-kx9l4\") pod \"nmstate-operator-5b5b58f5c8-s8bfr\" (UID: \"35feaa98-be47-42f8-af3b-bf8a5ef57ce4\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" Dec 10 12:05:42 crc kubenswrapper[4852]: I1210 12:05:42.996866 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9l4\" (UniqueName: \"kubernetes.io/projected/35feaa98-be47-42f8-af3b-bf8a5ef57ce4-kube-api-access-kx9l4\") pod \"nmstate-operator-5b5b58f5c8-s8bfr\" (UID: \"35feaa98-be47-42f8-af3b-bf8a5ef57ce4\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" Dec 10 12:05:43 crc kubenswrapper[4852]: I1210 12:05:43.055351 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" Dec 10 12:05:43 crc kubenswrapper[4852]: I1210 12:05:43.259902 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr"] Dec 10 12:05:43 crc kubenswrapper[4852]: I1210 12:05:43.855348 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n6xhs" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="registry-server" probeResult="failure" output=< Dec 10 12:05:43 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s Dec 10 12:05:43 crc kubenswrapper[4852]: > Dec 10 12:05:44 crc kubenswrapper[4852]: I1210 12:05:44.026540 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" event={"ID":"35feaa98-be47-42f8-af3b-bf8a5ef57ce4","Type":"ContainerStarted","Data":"e73bc6c3c8812a9436c9cc77328cd2a5758030e0de1190e81b4a431742ccbd3e"} Dec 10 12:05:45 crc kubenswrapper[4852]: I1210 12:05:45.790258 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:05:45 crc kubenswrapper[4852]: I1210 12:05:45.790732 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:05:52 crc kubenswrapper[4852]: I1210 12:05:52.078688 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" event={"ID":"35feaa98-be47-42f8-af3b-bf8a5ef57ce4","Type":"ContainerStarted","Data":"8ab178e58b5d95f8f8fe7e53ac795c2a3e8b828f7586febe63d1fd65ea7232d3"} Dec 10 12:05:52 crc kubenswrapper[4852]: I1210 12:05:52.115057 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-s8bfr" podStartSLOduration=2.031914969 podStartE2EDuration="10.115022887s" podCreationTimestamp="2025-12-10 12:05:42 +0000 UTC" firstStartedPulling="2025-12-10 12:05:43.273084516 +0000 UTC m=+829.358609750" lastFinishedPulling="2025-12-10 12:05:51.356192424 +0000 UTC m=+837.441717668" observedRunningTime="2025-12-10 12:05:52.103653442 +0000 UTC m=+838.189178666" watchObservedRunningTime="2025-12-10 12:05:52.115022887 +0000 UTC m=+838.200548151" Dec 10 12:05:52 crc kubenswrapper[4852]: I1210 12:05:52.851587 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:52 crc kubenswrapper[4852]: I1210 12:05:52.890253 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.009624 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.010706 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.012351 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-95lhs" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.020256 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.021127 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.024840 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.025930 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.029879 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-pcbgh"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.030805 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.048549 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.080615 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n6xhs"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.126472 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.127502 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.130738 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.130813 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.131107 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xh4j9" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.145459 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.205443 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-ovs-socket\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.205707 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-nmstate-lock\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.205729 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxd4f\" (UniqueName: \"kubernetes.io/projected/5b77c48c-a8a1-440d-8e0d-fab8d2087ede-kube-api-access-hxd4f\") pod \"nmstate-webhook-5f6d4c5ccb-hx5dz\" (UID: \"5b77c48c-a8a1-440d-8e0d-fab8d2087ede\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.205772 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5b77c48c-a8a1-440d-8e0d-fab8d2087ede-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-hx5dz\" (UID: \"5b77c48c-a8a1-440d-8e0d-fab8d2087ede\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.206020 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cnw\" (UniqueName: \"kubernetes.io/projected/21f5475e-7988-44d7-940f-76c59cf92f7e-kube-api-access-b9cnw\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.206269 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-dbus-socket\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.206293 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl8sn\" (UniqueName: \"kubernetes.io/projected/94e816ec-cfe3-413c-98f4-5d6f2880d16f-kube-api-access-rl8sn\") pod 
\"nmstate-metrics-7f946cbc9-2mcc9\" (UID: \"94e816ec-cfe3-413c-98f4-5d6f2880d16f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308054 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxd4f\" (UniqueName: \"kubernetes.io/projected/5b77c48c-a8a1-440d-8e0d-fab8d2087ede-kube-api-access-hxd4f\") pod \"nmstate-webhook-5f6d4c5ccb-hx5dz\" (UID: \"5b77c48c-a8a1-440d-8e0d-fab8d2087ede\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308096 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5b77c48c-a8a1-440d-8e0d-fab8d2087ede-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-hx5dz\" (UID: \"5b77c48c-a8a1-440d-8e0d-fab8d2087ede\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308120 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88acf534-fb28-4e05-bab0-f60364533fae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308141 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cnw\" (UniqueName: \"kubernetes.io/projected/21f5475e-7988-44d7-940f-76c59cf92f7e-kube-api-access-b9cnw\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308188 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-dbus-socket\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308204 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl8sn\" (UniqueName: \"kubernetes.io/projected/94e816ec-cfe3-413c-98f4-5d6f2880d16f-kube-api-access-rl8sn\") pod \"nmstate-metrics-7f946cbc9-2mcc9\" (UID: \"94e816ec-cfe3-413c-98f4-5d6f2880d16f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308372 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7jbk\" (UniqueName: \"kubernetes.io/projected/88acf534-fb28-4e05-bab0-f60364533fae-kube-api-access-s7jbk\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308474 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-ovs-socket\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308500 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88acf534-fb28-4e05-bab0-f60364533fae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308593 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-nmstate-lock\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.308703 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-nmstate-lock\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.309053 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-ovs-socket\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.309250 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/21f5475e-7988-44d7-940f-76c59cf92f7e-dbus-socket\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.314724 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5b77c48c-a8a1-440d-8e0d-fab8d2087ede-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-hx5dz\" (UID: \"5b77c48c-a8a1-440d-8e0d-fab8d2087ede\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.324849 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxd4f\" (UniqueName: \"kubernetes.io/projected/5b77c48c-a8a1-440d-8e0d-fab8d2087ede-kube-api-access-hxd4f\") pod \"nmstate-webhook-5f6d4c5ccb-hx5dz\" (UID: \"5b77c48c-a8a1-440d-8e0d-fab8d2087ede\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.325806 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cnw\" (UniqueName: \"kubernetes.io/projected/21f5475e-7988-44d7-940f-76c59cf92f7e-kube-api-access-b9cnw\") pod \"nmstate-handler-pcbgh\" (UID: \"21f5475e-7988-44d7-940f-76c59cf92f7e\") " pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.330445 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl8sn\" (UniqueName: \"kubernetes.io/projected/94e816ec-cfe3-413c-98f4-5d6f2880d16f-kube-api-access-rl8sn\") pod \"nmstate-metrics-7f946cbc9-2mcc9\" (UID: \"94e816ec-cfe3-413c-98f4-5d6f2880d16f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.343177 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.343362 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-56dd54f94b-xqpmz"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.344250 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.347562 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.363926 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56dd54f94b-xqpmz"] Dec 10 12:05:53 crc kubenswrapper[4852]: W1210 12:05:53.387443 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21f5475e_7988_44d7_940f_76c59cf92f7e.slice/crio-cbee677971cc53f0bb736e0e3b577eaad89fe2f1a9067dbb223cbb136a81a623 WatchSource:0}: Error finding container cbee677971cc53f0bb736e0e3b577eaad89fe2f1a9067dbb223cbb136a81a623: Status 404 returned error can't find the container with id cbee677971cc53f0bb736e0e3b577eaad89fe2f1a9067dbb223cbb136a81a623 Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.410158 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7jbk\" (UniqueName: \"kubernetes.io/projected/88acf534-fb28-4e05-bab0-f60364533fae-kube-api-access-s7jbk\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.411636 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88acf534-fb28-4e05-bab0-f60364533fae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.412031 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88acf534-fb28-4e05-bab0-f60364533fae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.413283 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/88acf534-fb28-4e05-bab0-f60364533fae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.419343 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/88acf534-fb28-4e05-bab0-f60364533fae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.430973 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s7jbk\" (UniqueName: \"kubernetes.io/projected/88acf534-fb28-4e05-bab0-f60364533fae-kube-api-access-s7jbk\") pod \"nmstate-console-plugin-7fbb5f6569-zzw4g\" (UID: \"88acf534-fb28-4e05-bab0-f60364533fae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.444321 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.513360 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-serving-cert\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.513790 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-config\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.513833 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-oauth-config\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.513885 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-trusted-ca-bundle\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.513973 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlr4s\" (UniqueName: \"kubernetes.io/projected/0052d5c3-37fd-4425-aaaf-65cf952c8894-kube-api-access-dlr4s\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.514006 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-oauth-serving-cert\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.514027 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-service-ca\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.571901 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz"] Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.614878 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlr4s\" (UniqueName: \"kubernetes.io/projected/0052d5c3-37fd-4425-aaaf-65cf952c8894-kube-api-access-dlr4s\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.614926 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-oauth-serving-cert\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.614944 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-service-ca\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.614975 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-serving-cert\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.614993 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-config\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.615016 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-oauth-config\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.615040 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-trusted-ca-bundle\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.616370 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-service-ca\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.616410 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-oauth-serving-cert\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " 
pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.616515 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-trusted-ca-bundle\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.616797 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-config\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.620869 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-oauth-config\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.621424 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0052d5c3-37fd-4425-aaaf-65cf952c8894-console-serving-cert\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.630125 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.635196 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlr4s\" (UniqueName: \"kubernetes.io/projected/0052d5c3-37fd-4425-aaaf-65cf952c8894-kube-api-access-dlr4s\") pod \"console-56dd54f94b-xqpmz\" (UID: \"0052d5c3-37fd-4425-aaaf-65cf952c8894\") " pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.659002 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g"] Dec 10 12:05:53 crc kubenswrapper[4852]: W1210 12:05:53.667423 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88acf534_fb28_4e05_bab0_f60364533fae.slice/crio-c2488e1603631de6519399eb4c38a0ea6c282de5418d907e09f8f63031f3d1ca WatchSource:0}: Error finding container c2488e1603631de6519399eb4c38a0ea6c282de5418d907e09f8f63031f3d1ca: Status 404 returned error can't find the container with id c2488e1603631de6519399eb4c38a0ea6c282de5418d907e09f8f63031f3d1ca Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.732589 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.801852 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9"] Dec 10 12:05:53 crc kubenswrapper[4852]: W1210 12:05:53.816679 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e816ec_cfe3_413c_98f4_5d6f2880d16f.slice/crio-71dc5f3798f78123461fbc51b2a4c5d311b69eff68508b73f0627f61bdf1a83b WatchSource:0}: Error finding container 71dc5f3798f78123461fbc51b2a4c5d311b69eff68508b73f0627f61bdf1a83b: Status 404 returned error can't find the container with id 71dc5f3798f78123461fbc51b2a4c5d311b69eff68508b73f0627f61bdf1a83b Dec 10 12:05:53 crc kubenswrapper[4852]: I1210 12:05:53.903065 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56dd54f94b-xqpmz"] Dec 10 12:05:53 crc kubenswrapper[4852]: W1210 12:05:53.911987 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0052d5c3_37fd_4425_aaaf_65cf952c8894.slice/crio-9ae32a3f573da3f14b6ec7b24738844ae3d35b39466cbec1892f155eb3efb7a5 WatchSource:0}: Error finding container 9ae32a3f573da3f14b6ec7b24738844ae3d35b39466cbec1892f155eb3efb7a5: Status 404 returned error can't find the container with id 9ae32a3f573da3f14b6ec7b24738844ae3d35b39466cbec1892f155eb3efb7a5 Dec 10 12:05:54 crc kubenswrapper[4852]: I1210 12:05:54.091864 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" event={"ID":"5b77c48c-a8a1-440d-8e0d-fab8d2087ede","Type":"ContainerStarted","Data":"434560511f83454370cc9278e1e2ee72f3120dcf22d1a834678925cc593d6610"} Dec 10 12:05:54 crc kubenswrapper[4852]: I1210 12:05:54.093138 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" event={"ID":"88acf534-fb28-4e05-bab0-f60364533fae","Type":"ContainerStarted","Data":"c2488e1603631de6519399eb4c38a0ea6c282de5418d907e09f8f63031f3d1ca"} Dec 10 12:05:54 crc kubenswrapper[4852]: I1210 12:05:54.094263 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dd54f94b-xqpmz" event={"ID":"0052d5c3-37fd-4425-aaaf-65cf952c8894","Type":"ContainerStarted","Data":"9ae32a3f573da3f14b6ec7b24738844ae3d35b39466cbec1892f155eb3efb7a5"} Dec 10 12:05:54 crc kubenswrapper[4852]: I1210 12:05:54.095104 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" event={"ID":"94e816ec-cfe3-413c-98f4-5d6f2880d16f","Type":"ContainerStarted","Data":"71dc5f3798f78123461fbc51b2a4c5d311b69eff68508b73f0627f61bdf1a83b"} Dec 10 12:05:54 crc kubenswrapper[4852]: I1210 12:05:54.095981 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pcbgh" event={"ID":"21f5475e-7988-44d7-940f-76c59cf92f7e","Type":"ContainerStarted","Data":"cbee677971cc53f0bb736e0e3b577eaad89fe2f1a9067dbb223cbb136a81a623"} Dec 10 12:05:54 crc kubenswrapper[4852]: I1210 12:05:54.096165 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n6xhs" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="registry-server" containerID="cri-o://2a48fc21ae7d48d789a1223fb44645a44dfe63a295a73452e4980c7a3966194b" gracePeriod=2 Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.104687 4852 
generic.go:334] "Generic (PLEG): container finished" podID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerID="2a48fc21ae7d48d789a1223fb44645a44dfe63a295a73452e4980c7a3966194b" exitCode=0 Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.104750 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerDied","Data":"2a48fc21ae7d48d789a1223fb44645a44dfe63a295a73452e4980c7a3966194b"} Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.107737 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56dd54f94b-xqpmz" event={"ID":"0052d5c3-37fd-4425-aaaf-65cf952c8894","Type":"ContainerStarted","Data":"bc466f171645dd0cd1f4c2717ae058ae7b2ec542aeb01a675a32f0fe5b21643b"} Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.123825 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56dd54f94b-xqpmz" podStartSLOduration=2.123805395 podStartE2EDuration="2.123805395s" podCreationTimestamp="2025-12-10 12:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:05:55.122876661 +0000 UTC m=+841.208401905" watchObservedRunningTime="2025-12-10 12:05:55.123805395 +0000 UTC m=+841.209330619" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.505673 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.647651 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-catalog-content\") pod \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.647715 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbgxt\" (UniqueName: \"kubernetes.io/projected/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-kube-api-access-rbgxt\") pod \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.647804 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-utilities\") pod \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\" (UID: \"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5\") " Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.653954 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-utilities" (OuterVolumeSpecName: "utilities") pod "f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" (UID: "f30b0382-24bf-4e69-9ad7-1071ebf1f2b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.671200 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-kube-api-access-rbgxt" (OuterVolumeSpecName: "kube-api-access-rbgxt") pod "f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" (UID: "f30b0382-24bf-4e69-9ad7-1071ebf1f2b5"). InnerVolumeSpecName "kube-api-access-rbgxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.749519 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbgxt\" (UniqueName: \"kubernetes.io/projected/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-kube-api-access-rbgxt\") on node \"crc\" DevicePath \"\"" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.749555 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.752090 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" (UID: "f30b0382-24bf-4e69-9ad7-1071ebf1f2b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:05:55 crc kubenswrapper[4852]: I1210 12:05:55.850812 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.116102 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n6xhs" Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.118451 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n6xhs" event={"ID":"f30b0382-24bf-4e69-9ad7-1071ebf1f2b5","Type":"ContainerDied","Data":"f17b36a1c372ffce6e0e30a62a8f48ca7443c802b83338ff9d7c8bd1fdda988e"} Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.118502 4852 scope.go:117] "RemoveContainer" containerID="2a48fc21ae7d48d789a1223fb44645a44dfe63a295a73452e4980c7a3966194b" Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.157333 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n6xhs"] Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.162658 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n6xhs"] Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.175693 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" path="/var/lib/kubelet/pods/f30b0382-24bf-4e69-9ad7-1071ebf1f2b5/volumes" Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.269041 4852 scope.go:117] "RemoveContainer" containerID="f9156500ce82d73aa7cd53d7f742539c96565f692397d3e3f6d743c30dbfe8ce" Dec 10 12:05:56 crc kubenswrapper[4852]: I1210 12:05:56.335899 4852 scope.go:117] "RemoveContainer" containerID="b3f80bd2b9ae30252331fff00747fbf158796ed1863cb8e6f83cf500b674b36c" Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.133778 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" event={"ID":"94e816ec-cfe3-413c-98f4-5d6f2880d16f","Type":"ContainerStarted","Data":"e43ccd98925d79d3ff5b9ecd07d0c02973e29dddc381117d5724507cbae275aa"} Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.136705 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pcbgh" 
event={"ID":"21f5475e-7988-44d7-940f-76c59cf92f7e","Type":"ContainerStarted","Data":"4dd301b2d8b949c2e32ce3fb8ec871dd412a16f638850d2cf916d59105d2ffa4"} Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.136938 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.140449 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" event={"ID":"5b77c48c-a8a1-440d-8e0d-fab8d2087ede","Type":"ContainerStarted","Data":"3c4d4a472ee072cc94dbb5ba97ade7b2f1a5518c54e3dbe0f5c8669d88da0f22"} Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.140592 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.142388 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" event={"ID":"88acf534-fb28-4e05-bab0-f60364533fae","Type":"ContainerStarted","Data":"651c8cae8806d503d48d916c9c4fe7e7f01134fa73f21137431ce656ef62d426"} Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.158132 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-pcbgh" podStartSLOduration=2.208089633 podStartE2EDuration="5.15811118s" podCreationTimestamp="2025-12-10 12:05:53 +0000 UTC" firstStartedPulling="2025-12-10 12:05:53.392450353 +0000 UTC m=+839.477975577" lastFinishedPulling="2025-12-10 12:05:56.3424719 +0000 UTC m=+842.427997124" observedRunningTime="2025-12-10 12:05:58.155067413 +0000 UTC m=+844.240592637" watchObservedRunningTime="2025-12-10 12:05:58.15811118 +0000 UTC m=+844.243636404" Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.173748 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" podStartSLOduration=3.405626316 podStartE2EDuration="6.173731205s" podCreationTimestamp="2025-12-10 12:05:52 +0000 UTC" firstStartedPulling="2025-12-10 12:05:53.582703082 +0000 UTC m=+839.668228306" lastFinishedPulling="2025-12-10 12:05:56.350807971 +0000 UTC m=+842.436333195" observedRunningTime="2025-12-10 12:05:58.171263983 +0000 UTC m=+844.256789227" watchObservedRunningTime="2025-12-10 12:05:58.173731205 +0000 UTC m=+844.259256429" Dec 10 12:05:58 crc kubenswrapper[4852]: I1210 12:05:58.209801 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-zzw4g" podStartSLOduration=2.53755746 podStartE2EDuration="5.209775938s" podCreationTimestamp="2025-12-10 12:05:53 +0000 UTC" firstStartedPulling="2025-12-10 12:05:53.669989366 +0000 UTC m=+839.755514590" lastFinishedPulling="2025-12-10 12:05:56.342207844 +0000 UTC m=+842.427733068" observedRunningTime="2025-12-10 12:05:58.208732301 +0000 UTC m=+844.294257545" watchObservedRunningTime="2025-12-10 12:05:58.209775938 +0000 UTC m=+844.295301162" Dec 10 12:06:01 crc kubenswrapper[4852]: I1210 12:06:01.161937 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" event={"ID":"94e816ec-cfe3-413c-98f4-5d6f2880d16f","Type":"ContainerStarted","Data":"1c8d66ae967069a91b5e50249c4b214427547e7ef3c906cfb49829e06ad16f22"} Dec 10 12:06:01 crc kubenswrapper[4852]: I1210 12:06:01.186023 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-2mcc9" podStartSLOduration=2.850554119 podStartE2EDuration="9.18600433s" podCreationTimestamp="2025-12-10 12:05:52 +0000 UTC" firstStartedPulling="2025-12-10 12:05:53.820403878 +0000 UTC m=+839.905929102" lastFinishedPulling="2025-12-10 12:06:00.155854089 +0000 UTC m=+846.241379313" observedRunningTime="2025-12-10 12:06:01.185241761 +0000 UTC m=+847.270766995" watchObservedRunningTime="2025-12-10 12:06:01.18600433 +0000 UTC m=+847.271529554" Dec 10 12:06:03 crc kubenswrapper[4852]: I1210 12:06:03.366890 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-pcbgh" Dec 10 12:06:03 crc kubenswrapper[4852]: I1210 12:06:03.733103 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:06:03 crc kubenswrapper[4852]: I1210 12:06:03.733510 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:06:03 crc kubenswrapper[4852]: I1210 12:06:03.738441 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:06:04 crc kubenswrapper[4852]: I1210 12:06:04.181416 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56dd54f94b-xqpmz" Dec 10 12:06:04 crc kubenswrapper[4852]: I1210 12:06:04.232371 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-c82cd"] Dec 10 12:06:13 crc kubenswrapper[4852]: I1210 12:06:13.349525 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-hx5dz" Dec 10 12:06:15 crc kubenswrapper[4852]: I1210 12:06:15.790353 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:06:15 crc kubenswrapper[4852]: I1210 12:06:15.790691 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:06:15 crc kubenswrapper[4852]: I1210 12:06:15.790739 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:06:15 crc kubenswrapper[4852]: I1210 12:06:15.791378 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77c211572bcee4c8a77c07da48869683ba7551ebec91c3aa4c5542663748ddba"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:06:15 crc kubenswrapper[4852]: I1210 12:06:15.791445 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://77c211572bcee4c8a77c07da48869683ba7551ebec91c3aa4c5542663748ddba" 
gracePeriod=600 Dec 10 12:06:16 crc kubenswrapper[4852]: I1210 12:06:16.246774 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="77c211572bcee4c8a77c07da48869683ba7551ebec91c3aa4c5542663748ddba" exitCode=0 Dec 10 12:06:16 crc kubenswrapper[4852]: I1210 12:06:16.246829 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"77c211572bcee4c8a77c07da48869683ba7551ebec91c3aa4c5542663748ddba"} Dec 10 12:06:16 crc kubenswrapper[4852]: I1210 12:06:16.246863 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"1526ef670a66096e17d3fb224d460b0768d7f2150066a4bc7f3d701b213bd881"} Dec 10 12:06:16 crc kubenswrapper[4852]: I1210 12:06:16.246884 4852 scope.go:117] "RemoveContainer" containerID="5f347330cd2d5cdf86ff9446a444fd62f87cac15b734887bf743393441452d4f" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.675978 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt"] Dec 10 12:06:28 crc kubenswrapper[4852]: E1210 12:06:28.677748 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="registry-server" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.677831 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="registry-server" Dec 10 12:06:28 crc kubenswrapper[4852]: E1210 12:06:28.677908 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="extract-content" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.677998 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="extract-content" Dec 10 12:06:28 crc kubenswrapper[4852]: E1210 12:06:28.678079 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="extract-utilities" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.678151 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="extract-utilities" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.678386 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f30b0382-24bf-4e69-9ad7-1071ebf1f2b5" containerName="registry-server" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.679396 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.686539 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt"] Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.689398 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.783612 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwpjk\" (UniqueName: \"kubernetes.io/projected/fe5305bc-61f3-4176-902a-5e0c821b9ff3-kube-api-access-qwpjk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.783657 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.783701 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.885347 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpjk\" (UniqueName: \"kubernetes.io/projected/fe5305bc-61f3-4176-902a-5e0c821b9ff3-kube-api-access-qwpjk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.885847 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.885917 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.886690 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.886810 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:28 crc kubenswrapper[4852]: I1210 12:06:28.909052 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwpjk\" (UniqueName: \"kubernetes.io/projected/fe5305bc-61f3-4176-902a-5e0c821b9ff3-kube-api-access-qwpjk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:29 crc kubenswrapper[4852]: I1210 12:06:29.002116 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:29 crc kubenswrapper[4852]: I1210 12:06:29.271898 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-c82cd" podUID="736a1895-9f79-4788-9f63-5b9b3406540d" containerName="console" containerID="cri-o://0ccd9f9c506f014c9b442e188cc33c8af27019ebc3e8d53ea579729aa43e2da9" gracePeriod=15 Dec 10 12:06:29 crc kubenswrapper[4852]: I1210 12:06:29.986062 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt"] Dec 10 12:06:30 crc kubenswrapper[4852]: W1210 12:06:30.055084 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe5305bc_61f3_4176_902a_5e0c821b9ff3.slice/crio-920fcb49505eb16d9801736a61f4bb23893e4c1f73e481d66344ce87acdb6f36 WatchSource:0}: Error finding container 920fcb49505eb16d9801736a61f4bb23893e4c1f73e481d66344ce87acdb6f36: Status 404 returned error can't find the container with id 920fcb49505eb16d9801736a61f4bb23893e4c1f73e481d66344ce87acdb6f36 Dec 10 12:06:30 crc kubenswrapper[4852]: I1210 12:06:30.347459 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" event={"ID":"fe5305bc-61f3-4176-902a-5e0c821b9ff3","Type":"ContainerStarted","Data":"920fcb49505eb16d9801736a61f4bb23893e4c1f73e481d66344ce87acdb6f36"} Dec 10 12:06:30 crc kubenswrapper[4852]: I1210 12:06:30.349311 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c82cd_736a1895-9f79-4788-9f63-5b9b3406540d/console/0.log" Dec 10 12:06:30 crc kubenswrapper[4852]: I1210 12:06:30.349355 4852 generic.go:334] "Generic (PLEG): container finished" podID="736a1895-9f79-4788-9f63-5b9b3406540d" containerID="0ccd9f9c506f014c9b442e188cc33c8af27019ebc3e8d53ea579729aa43e2da9" exitCode=2 Dec 10 12:06:30 crc kubenswrapper[4852]: I1210 12:06:30.349379 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c82cd" 
event={"ID":"736a1895-9f79-4788-9f63-5b9b3406540d","Type":"ContainerDied","Data":"0ccd9f9c506f014c9b442e188cc33c8af27019ebc3e8d53ea579729aa43e2da9"} Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.355914 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" event={"ID":"fe5305bc-61f3-4176-902a-5e0c821b9ff3","Type":"ContainerStarted","Data":"6799cd3e737e8ded086832860cc7b4d0d85f6a1d33e5f407067a52b2ed133e9e"} Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.554705 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c82cd_736a1895-9f79-4788-9f63-5b9b3406540d/console/0.log" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.554767 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.719961 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-trusted-ca-bundle\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.720079 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-service-ca\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.720130 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-oauth-config\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.720158 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-console-config\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.720173 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-serving-cert\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.720201 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mm4n\" (UniqueName: \"kubernetes.io/projected/736a1895-9f79-4788-9f63-5b9b3406540d-kube-api-access-7mm4n\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.720221 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-oauth-serving-cert\") pod \"736a1895-9f79-4788-9f63-5b9b3406540d\" (UID: \"736a1895-9f79-4788-9f63-5b9b3406540d\") " Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.721060 4852 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-console-config" (OuterVolumeSpecName: "console-config") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.721080 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.721073 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-service-ca" (OuterVolumeSpecName: "service-ca") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.721491 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.725649 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.725927 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.726074 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736a1895-9f79-4788-9f63-5b9b3406540d-kube-api-access-7mm4n" (OuterVolumeSpecName: "kube-api-access-7mm4n") pod "736a1895-9f79-4788-9f63-5b9b3406540d" (UID: "736a1895-9f79-4788-9f63-5b9b3406540d"). InnerVolumeSpecName "kube-api-access-7mm4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821603 4852 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821647 4852 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821664 4852 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-console-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821678 4852 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/736a1895-9f79-4788-9f63-5b9b3406540d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821693 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mm4n\" (UniqueName: \"kubernetes.io/projected/736a1895-9f79-4788-9f63-5b9b3406540d-kube-api-access-7mm4n\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821706 4852 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:31 crc kubenswrapper[4852]: I1210 12:06:31.821719 4852 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/736a1895-9f79-4788-9f63-5b9b3406540d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.363651 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c82cd_736a1895-9f79-4788-9f63-5b9b3406540d/console/0.log" Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.364018 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c82cd" event={"ID":"736a1895-9f79-4788-9f63-5b9b3406540d","Type":"ContainerDied","Data":"532933a7b127e0613d8ccdd91fa264b484e95787261c27dc07c0bf2ae823eaa0"} Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.364055 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-c82cd" Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.364068 4852 scope.go:117] "RemoveContainer" containerID="0ccd9f9c506f014c9b442e188cc33c8af27019ebc3e8d53ea579729aa43e2da9" Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.367434 4852 generic.go:334] "Generic (PLEG): container finished" podID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerID="6799cd3e737e8ded086832860cc7b4d0d85f6a1d33e5f407067a52b2ed133e9e" exitCode=0 Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.367478 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" event={"ID":"fe5305bc-61f3-4176-902a-5e0c821b9ff3","Type":"ContainerDied","Data":"6799cd3e737e8ded086832860cc7b4d0d85f6a1d33e5f407067a52b2ed133e9e"} Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.384206 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-c82cd"] Dec 10 12:06:32 crc kubenswrapper[4852]: I1210 12:06:32.387393 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-c82cd"] Dec 10 12:06:34 crc kubenswrapper[4852]: I1210 12:06:34.176783 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736a1895-9f79-4788-9f63-5b9b3406540d" path="/var/lib/kubelet/pods/736a1895-9f79-4788-9f63-5b9b3406540d/volumes" Dec 10 12:06:38 crc kubenswrapper[4852]: I1210 12:06:38.403712 4852 generic.go:334] "Generic (PLEG): container finished" podID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerID="9c5e846f5004b8b3193db7eb485ac4d83f414f5c22a1a2f990705eac56f3e538" exitCode=0 Dec 10 12:06:38 crc kubenswrapper[4852]: I1210 12:06:38.403761 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" event={"ID":"fe5305bc-61f3-4176-902a-5e0c821b9ff3","Type":"ContainerDied","Data":"9c5e846f5004b8b3193db7eb485ac4d83f414f5c22a1a2f990705eac56f3e538"} Dec 10 12:06:39 crc kubenswrapper[4852]: I1210 12:06:39.411977 4852 generic.go:334] "Generic (PLEG): container finished" podID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerID="8217b1502130eceeebd91fc187ef2703603d44c62a27c12ac12334f30ab4859f" exitCode=0 Dec 10 12:06:39 crc kubenswrapper[4852]: I1210 12:06:39.412056 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" event={"ID":"fe5305bc-61f3-4176-902a-5e0c821b9ff3","Type":"ContainerDied","Data":"8217b1502130eceeebd91fc187ef2703603d44c62a27c12ac12334f30ab4859f"} Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.652185 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.736655 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwpjk\" (UniqueName: \"kubernetes.io/projected/fe5305bc-61f3-4176-902a-5e0c821b9ff3-kube-api-access-qwpjk\") pod \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.736906 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-bundle\") pod \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.737078 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-util\") pod \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\" (UID: \"fe5305bc-61f3-4176-902a-5e0c821b9ff3\") " Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.738889 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-bundle" (OuterVolumeSpecName: "bundle") pod "fe5305bc-61f3-4176-902a-5e0c821b9ff3" (UID: "fe5305bc-61f3-4176-902a-5e0c821b9ff3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.743774 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5305bc-61f3-4176-902a-5e0c821b9ff3-kube-api-access-qwpjk" (OuterVolumeSpecName: "kube-api-access-qwpjk") pod "fe5305bc-61f3-4176-902a-5e0c821b9ff3" (UID: "fe5305bc-61f3-4176-902a-5e0c821b9ff3"). InnerVolumeSpecName "kube-api-access-qwpjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.752232 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-util" (OuterVolumeSpecName: "util") pod "fe5305bc-61f3-4176-902a-5e0c821b9ff3" (UID: "fe5305bc-61f3-4176-902a-5e0c821b9ff3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.838891 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwpjk\" (UniqueName: \"kubernetes.io/projected/fe5305bc-61f3-4176-902a-5e0c821b9ff3-kube-api-access-qwpjk\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.838943 4852 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:40 crc kubenswrapper[4852]: I1210 12:06:40.838956 4852 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe5305bc-61f3-4176-902a-5e0c821b9ff3-util\") on node \"crc\" DevicePath \"\"" Dec 10 12:06:41 crc kubenswrapper[4852]: I1210 12:06:41.433148 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" event={"ID":"fe5305bc-61f3-4176-902a-5e0c821b9ff3","Type":"ContainerDied","Data":"920fcb49505eb16d9801736a61f4bb23893e4c1f73e481d66344ce87acdb6f36"} Dec 10 12:06:41 crc kubenswrapper[4852]: I1210 12:06:41.433186 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920fcb49505eb16d9801736a61f4bb23893e4c1f73e481d66344ce87acdb6f36" Dec 10 12:06:41 crc kubenswrapper[4852]: I1210 12:06:41.433207 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.686619 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b"] Dec 10 12:06:52 crc kubenswrapper[4852]: E1210 12:06:52.687662 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736a1895-9f79-4788-9f63-5b9b3406540d" containerName="console" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.687683 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="736a1895-9f79-4788-9f63-5b9b3406540d" containerName="console" Dec 10 12:06:52 crc kubenswrapper[4852]: E1210 12:06:52.687695 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="util" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.687705 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="util" Dec 10 12:06:52 crc kubenswrapper[4852]: E1210 12:06:52.687730 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="pull" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.687738 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="pull" Dec 10 12:06:52 crc kubenswrapper[4852]: E1210 12:06:52.687745 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="extract" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.687752 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="extract" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.687888 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="736a1895-9f79-4788-9f63-5b9b3406540d" containerName="console" Dec 
10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.687910 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5305bc-61f3-4176-902a-5e0c821b9ff3" containerName="extract" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.688553 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.690876 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.691454 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.691737 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.691763 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-cbxhn" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.691740 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.710629 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b"] Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.790981 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/356ac40e-2e68-4d75-81ca-b1e3306e263a-kube-api-access-57t2j\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.791145 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/356ac40e-2e68-4d75-81ca-b1e3306e263a-apiservice-cert\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.791182 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/356ac40e-2e68-4d75-81ca-b1e3306e263a-webhook-cert\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.893099 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/356ac40e-2e68-4d75-81ca-b1e3306e263a-kube-api-access-57t2j\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.893167 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/356ac40e-2e68-4d75-81ca-b1e3306e263a-apiservice-cert\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.893194 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/356ac40e-2e68-4d75-81ca-b1e3306e263a-webhook-cert\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.898718 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/356ac40e-2e68-4d75-81ca-b1e3306e263a-webhook-cert\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.899890 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/356ac40e-2e68-4d75-81ca-b1e3306e263a-apiservice-cert\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.914213 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57t2j\" (UniqueName: \"kubernetes.io/projected/356ac40e-2e68-4d75-81ca-b1e3306e263a-kube-api-access-57t2j\") pod \"metallb-operator-controller-manager-5c67dbf94b-rhs8b\" (UID: \"356ac40e-2e68-4d75-81ca-b1e3306e263a\") " pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.945012 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv"] Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.945758 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.949093 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.949175 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6q6kf" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.949734 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.968606 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv"] Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.994660 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c92ae4cf-27c3-46d4-9be9-8398e1276f61-webhook-cert\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.994764 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpxb\" (UniqueName: \"kubernetes.io/projected/c92ae4cf-27c3-46d4-9be9-8398e1276f61-kube-api-access-drpxb\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:52 crc kubenswrapper[4852]: I1210 12:06:52.994877 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c92ae4cf-27c3-46d4-9be9-8398e1276f61-apiservice-cert\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.010692 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.096492 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpxb\" (UniqueName: \"kubernetes.io/projected/c92ae4cf-27c3-46d4-9be9-8398e1276f61-kube-api-access-drpxb\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.096552 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c92ae4cf-27c3-46d4-9be9-8398e1276f61-apiservice-cert\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.096593 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c92ae4cf-27c3-46d4-9be9-8398e1276f61-webhook-cert\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.100947 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c92ae4cf-27c3-46d4-9be9-8398e1276f61-apiservice-cert\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.110472 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c92ae4cf-27c3-46d4-9be9-8398e1276f61-webhook-cert\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.114602 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpxb\" (UniqueName: \"kubernetes.io/projected/c92ae4cf-27c3-46d4-9be9-8398e1276f61-kube-api-access-drpxb\") pod \"metallb-operator-webhook-server-7d78c58b5f-5mtdv\" (UID: \"c92ae4cf-27c3-46d4-9be9-8398e1276f61\") " pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.265494 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.339695 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b"] Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.467101 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv"] Dec 10 12:06:53 crc kubenswrapper[4852]: W1210 12:06:53.485272 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc92ae4cf_27c3_46d4_9be9_8398e1276f61.slice/crio-b553ed29c43e90444e1fc831c55eec9df4689d5881ed50f2899c6b4ff87ad672 WatchSource:0}: Error finding container b553ed29c43e90444e1fc831c55eec9df4689d5881ed50f2899c6b4ff87ad672: Status 404 returned error can't find the container with id b553ed29c43e90444e1fc831c55eec9df4689d5881ed50f2899c6b4ff87ad672 Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.511396 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" event={"ID":"356ac40e-2e68-4d75-81ca-b1e3306e263a","Type":"ContainerStarted","Data":"445ad73f8620010f7ec691050801fe012bba65b733ecc1966c896ff9a0d5cee8"} Dec 10 12:06:53 crc kubenswrapper[4852]: I1210 12:06:53.513403 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" event={"ID":"c92ae4cf-27c3-46d4-9be9-8398e1276f61","Type":"ContainerStarted","Data":"b553ed29c43e90444e1fc831c55eec9df4689d5881ed50f2899c6b4ff87ad672"} Dec 10 12:06:58 crc kubenswrapper[4852]: I1210 12:06:58.553359 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" event={"ID":"356ac40e-2e68-4d75-81ca-b1e3306e263a","Type":"ContainerStarted","Data":"2e6c74dc4b19fda3f70dbc304b7e2643c57997293d7ff282d4f7177cfcae4f1b"} Dec 10 12:06:58 crc kubenswrapper[4852]: I1210 12:06:58.554013 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:07:04 crc kubenswrapper[4852]: I1210 12:07:04.597299 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" event={"ID":"c92ae4cf-27c3-46d4-9be9-8398e1276f61","Type":"ContainerStarted","Data":"b04b51d310639ed6269d180e0687f048bc5235d37d55d88199594ddbf11a7b6f"} Dec 10 12:07:04 crc kubenswrapper[4852]: I1210 12:07:04.597886 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:07:04 crc kubenswrapper[4852]: I1210 12:07:04.621784 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" podStartSLOduration=7.969898333 podStartE2EDuration="12.621763222s" podCreationTimestamp="2025-12-10 12:06:52 +0000 UTC" firstStartedPulling="2025-12-10 12:06:53.351863446 +0000 UTC m=+899.437388670" lastFinishedPulling="2025-12-10 12:06:58.003728335 +0000 UTC m=+904.089253559" observedRunningTime="2025-12-10 12:06:58.571695585 +0000 UTC m=+904.657220939" watchObservedRunningTime="2025-12-10 12:07:04.621763222 +0000 UTC m=+910.707288446" Dec 10 12:07:04 crc kubenswrapper[4852]: I1210 12:07:04.622436 4852 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" podStartSLOduration=2.916309814 podStartE2EDuration="12.622428849s" podCreationTimestamp="2025-12-10 12:06:52 +0000 UTC" firstStartedPulling="2025-12-10 12:06:53.488746862 +0000 UTC m=+899.574272086" lastFinishedPulling="2025-12-10 12:07:03.194865897 +0000 UTC m=+909.280391121" observedRunningTime="2025-12-10 12:07:04.614337124 +0000 UTC m=+910.699862338" watchObservedRunningTime="2025-12-10 12:07:04.622428849 +0000 UTC m=+910.707954073" Dec 10 12:07:10 crc kubenswrapper[4852]: I1210 12:07:10.976396 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gpfh6"] Dec 10 12:07:10 crc kubenswrapper[4852]: I1210 12:07:10.979151 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:10 crc kubenswrapper[4852]: I1210 12:07:10.991778 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpfh6"] Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.058990 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-utilities\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.059059 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-catalog-content\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.059096 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jkh2\" (UniqueName: \"kubernetes.io/projected/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-kube-api-access-6jkh2\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.160609 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-utilities\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.160685 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-catalog-content\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.160745 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jkh2\" (UniqueName: \"kubernetes.io/projected/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-kube-api-access-6jkh2\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 
12:07:11.161354 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-utilities\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.161471 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-catalog-content\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.190138 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jkh2\" (UniqueName: \"kubernetes.io/projected/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-kube-api-access-6jkh2\") pod \"redhat-marketplace-gpfh6\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.295018 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.527168 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpfh6"] Dec 10 12:07:11 crc kubenswrapper[4852]: W1210 12:07:11.530757 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974b7b03_d8b7_4d6b_8cfe_ee9113fbf263.slice/crio-f7a69c249980898a55b261efee181cad8df8f1a9ab19ca02b2cf5badc24e182f WatchSource:0}: Error finding container f7a69c249980898a55b261efee181cad8df8f1a9ab19ca02b2cf5badc24e182f: Status 404 returned error can't find the container with id f7a69c249980898a55b261efee181cad8df8f1a9ab19ca02b2cf5badc24e182f Dec 10 12:07:11 crc kubenswrapper[4852]: I1210 12:07:11.637971 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpfh6" event={"ID":"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263","Type":"ContainerStarted","Data":"f7a69c249980898a55b261efee181cad8df8f1a9ab19ca02b2cf5badc24e182f"} Dec 10 12:07:12 crc kubenswrapper[4852]: I1210 12:07:12.646257 4852 generic.go:334] "Generic (PLEG): container finished" podID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerID="df66e53b37bd201e44687edbc8dc902da20fe87cc38f18da4c1e0ba696cdf8ff" exitCode=0 Dec 10 12:07:12 crc kubenswrapper[4852]: I1210 12:07:12.646431 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpfh6" event={"ID":"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263","Type":"ContainerDied","Data":"df66e53b37bd201e44687edbc8dc902da20fe87cc38f18da4c1e0ba696cdf8ff"} Dec 10 12:07:13 crc kubenswrapper[4852]: I1210 12:07:13.270448 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d78c58b5f-5mtdv" Dec 10 12:07:15 crc kubenswrapper[4852]: I1210 12:07:15.984655 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s4cg2"] Dec 10 12:07:15 crc kubenswrapper[4852]: I1210 12:07:15.986267 4852 util.go:30] "No sandbox for pod can be found. 
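The W-level manager.go:1169 warning above ("Failed to process watch event ... Status 404") is a benign startup race: cAdvisor notices the new crio-<id> cgroup through its cgroup watch and tries to inspect the container before CRI-O has finished registering it, so the first lookup returns 404 and the container is picked up on a later housekeeping pass. A sketch of tolerating that race; inspect is a hypothetical stand-in for the runtime lookup:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New("status 404: can't find the container")

// inspect stands in for the container lookup that races with
// registration; here it starts succeeding on the third attempt.
func inspect(attempt int) error {
	if attempt < 3 {
		return errNotFound
	}
	return nil
}

func main() {
	for attempt := 1; ; attempt++ {
		if err := inspect(attempt); errors.Is(err, errNotFound) {
			fmt.Printf("attempt %d: %v (will retry)\n", attempt, err)
			time.Sleep(100 * time.Millisecond) // next housekeeping pass
			continue
		}
		fmt.Printf("attempt %d: container found\n", attempt)
		return
	}
}
```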
Need to start a new one" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:15 crc kubenswrapper[4852]: I1210 12:07:15.997962 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4cg2"] Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.185934 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-catalog-content\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.186194 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-utilities\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.186219 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shsf8\" (UniqueName: \"kubernetes.io/projected/acff76af-d815-49a0-bb14-c3b2c3ff2287-kube-api-access-shsf8\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.287378 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-utilities\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.287421 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shsf8\" (UniqueName: \"kubernetes.io/projected/acff76af-d815-49a0-bb14-c3b2c3ff2287-kube-api-access-shsf8\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.287477 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-catalog-content\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.287992 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-catalog-content\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.287998 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-utilities\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.322195 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-shsf8\" (UniqueName: \"kubernetes.io/projected/acff76af-d815-49a0-bb14-c3b2c3ff2287-kube-api-access-shsf8\") pod \"certified-operators-s4cg2\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:16 crc kubenswrapper[4852]: I1210 12:07:16.601105 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:17 crc kubenswrapper[4852]: I1210 12:07:17.042614 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4cg2"] Dec 10 12:07:17 crc kubenswrapper[4852]: W1210 12:07:17.046355 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacff76af_d815_49a0_bb14_c3b2c3ff2287.slice/crio-159d4cbe0f45c58d29ce031da8683416f71908302d2daf153b45de96f4f6996e WatchSource:0}: Error finding container 159d4cbe0f45c58d29ce031da8683416f71908302d2daf153b45de96f4f6996e: Status 404 returned error can't find the container with id 159d4cbe0f45c58d29ce031da8683416f71908302d2daf153b45de96f4f6996e Dec 10 12:07:17 crc kubenswrapper[4852]: I1210 12:07:17.684223 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cg2" event={"ID":"acff76af-d815-49a0-bb14-c3b2c3ff2287","Type":"ContainerStarted","Data":"159d4cbe0f45c58d29ce031da8683416f71908302d2daf153b45de96f4f6996e"} Dec 10 12:07:20 crc kubenswrapper[4852]: I1210 12:07:20.703777 4852 generic.go:334] "Generic (PLEG): container finished" podID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerID="0e26c1253124b6a68da975ce4c767901fefdb7d54b95e7f94c264f220b034b9d" exitCode=0 Dec 10 12:07:20 crc kubenswrapper[4852]: I1210 12:07:20.703869 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpfh6" event={"ID":"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263","Type":"ContainerDied","Data":"0e26c1253124b6a68da975ce4c767901fefdb7d54b95e7f94c264f220b034b9d"} Dec 10 12:07:20 crc kubenswrapper[4852]: I1210 12:07:20.706073 4852 generic.go:334] "Generic (PLEG): container finished" podID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerID="844cce677af90fe468e06894725548c77412892ae2e0640cfa4c06174dfa939e" exitCode=0 Dec 10 12:07:20 crc kubenswrapper[4852]: I1210 12:07:20.706112 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cg2" event={"ID":"acff76af-d815-49a0-bb14-c3b2c3ff2287","Type":"ContainerDied","Data":"844cce677af90fe468e06894725548c77412892ae2e0640cfa4c06174dfa939e"} Dec 10 12:07:22 crc kubenswrapper[4852]: I1210 12:07:22.742691 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpfh6" event={"ID":"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263","Type":"ContainerStarted","Data":"9741e69f5a3dd9e96d2405ac26392f9d0b649357e34db305816c3cb3d5b8a449"} Dec 10 12:07:22 crc kubenswrapper[4852]: I1210 12:07:22.761735 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gpfh6" podStartSLOduration=3.102288224 podStartE2EDuration="12.761719043s" podCreationTimestamp="2025-12-10 12:07:10 +0000 UTC" firstStartedPulling="2025-12-10 12:07:12.647937901 +0000 UTC m=+918.733463125" lastFinishedPulling="2025-12-10 12:07:22.30736872 +0000 UTC m=+928.392893944" observedRunningTime="2025-12-10 12:07:22.757993729 +0000 UTC 
m=+928.843518963" watchObservedRunningTime="2025-12-10 12:07:22.761719043 +0000 UTC m=+928.847244267" Dec 10 12:07:23 crc kubenswrapper[4852]: I1210 12:07:23.751813 4852 generic.go:334] "Generic (PLEG): container finished" podID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerID="c756d0e1214d9b3aaf6d8b9afdc5e660569210107346f3a75ab797dc523705fe" exitCode=0 Dec 10 12:07:23 crc kubenswrapper[4852]: I1210 12:07:23.751919 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cg2" event={"ID":"acff76af-d815-49a0-bb14-c3b2c3ff2287","Type":"ContainerDied","Data":"c756d0e1214d9b3aaf6d8b9afdc5e660569210107346f3a75ab797dc523705fe"} Dec 10 12:07:27 crc kubenswrapper[4852]: I1210 12:07:27.780696 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cg2" event={"ID":"acff76af-d815-49a0-bb14-c3b2c3ff2287","Type":"ContainerStarted","Data":"c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13"} Dec 10 12:07:27 crc kubenswrapper[4852]: I1210 12:07:27.803747 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s4cg2" podStartSLOduration=6.558309905 podStartE2EDuration="12.803726728s" podCreationTimestamp="2025-12-10 12:07:15 +0000 UTC" firstStartedPulling="2025-12-10 12:07:20.706955591 +0000 UTC m=+926.792480815" lastFinishedPulling="2025-12-10 12:07:26.952372404 +0000 UTC m=+933.037897638" observedRunningTime="2025-12-10 12:07:27.797298306 +0000 UTC m=+933.882823560" watchObservedRunningTime="2025-12-10 12:07:27.803726728 +0000 UTC m=+933.889251962" Dec 10 12:07:31 crc kubenswrapper[4852]: I1210 12:07:31.295771 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:31 crc kubenswrapper[4852]: I1210 12:07:31.296101 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:31 crc kubenswrapper[4852]: I1210 12:07:31.334729 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:31 crc kubenswrapper[4852]: I1210 12:07:31.842867 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.013461 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c67dbf94b-rhs8b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.723587 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pddhf"] Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.726527 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.728134 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b"] Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.728457 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.728765 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vxnls" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.729067 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.731256 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.731462 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.743018 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b"] Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.805906 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jfz84"] Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.807069 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jfz84" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.808568 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xgjz6" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.808579 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.809905 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.809914 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.819284 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-gj6ss"] Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.820325 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.823224 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.831269 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-gj6ss"] Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.837913 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkzb\" (UniqueName: \"kubernetes.io/projected/9c84ab71-bcb9-4237-a827-4fe3c1c2c754-kube-api-access-knkzb\") pod \"frr-k8s-webhook-server-7fcb986d4-9cr6b\" (UID: \"9c84ab71-bcb9-4237-a827-4fe3c1c2c754\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838134 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-conf\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838193 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics-certs\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838265 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-sockets\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838285 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-reloader\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838354 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wblbr\" (UniqueName: \"kubernetes.io/projected/17cd493c-8f5c-4567-8959-cf6ae0011e51-kube-api-access-wblbr\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838374 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-startup\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.838419 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 
12:07:33.838445 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84ab71-bcb9-4237-a827-4fe3c1c2c754-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9cr6b\" (UID: \"9c84ab71-bcb9-4237-a827-4fe3c1c2c754\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.939752 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwr5\" (UniqueName: \"kubernetes.io/projected/2335158c-cbc5-45a0-9438-a879aede67f1-kube-api-access-6pwr5\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.939819 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wblbr\" (UniqueName: \"kubernetes.io/projected/17cd493c-8f5c-4567-8959-cf6ae0011e51-kube-api-access-wblbr\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.939939 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-startup\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.939988 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940030 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84ab71-bcb9-4237-a827-4fe3c1c2c754-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9cr6b\" (UID: \"9c84ab71-bcb9-4237-a827-4fe3c1c2c754\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940060 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-metrics-certs\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940086 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940139 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkzb\" (UniqueName: \"kubernetes.io/projected/9c84ab71-bcb9-4237-a827-4fe3c1c2c754-kube-api-access-knkzb\") pod \"frr-k8s-webhook-server-7fcb986d4-9cr6b\" (UID: \"9c84ab71-bcb9-4237-a827-4fe3c1c2c754\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940160 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/357f1ff0-29a8-4905-bac8-9bc8a5c03199-metallb-excludel2\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940189 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-conf\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940216 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-cert\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940259 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics-certs\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940289 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-metrics-certs\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940316 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdv8c\" (UniqueName: \"kubernetes.io/projected/357f1ff0-29a8-4905-bac8-9bc8a5c03199-kube-api-access-rdv8c\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940353 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-sockets\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.940381 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-reloader\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.941026 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-reloader\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: E1210 12:07:33.941166 4852 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 10 12:07:33 crc kubenswrapper[4852]: E1210 12:07:33.941253 4852 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics-certs podName:17cd493c-8f5c-4567-8959-cf6ae0011e51 nodeName:}" failed. No retries permitted until 2025-12-10 12:07:34.441208707 +0000 UTC m=+940.526733941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics-certs") pod "frr-k8s-pddhf" (UID: "17cd493c-8f5c-4567-8959-cf6ae0011e51") : secret "frr-k8s-certs-secret" not found Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.941352 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-conf\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.941536 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.941749 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-sockets\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.942216 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/17cd493c-8f5c-4567-8959-cf6ae0011e51-frr-startup\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.947441 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84ab71-bcb9-4237-a827-4fe3c1c2c754-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-9cr6b\" (UID: \"9c84ab71-bcb9-4237-a827-4fe3c1c2c754\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.972988 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wblbr\" (UniqueName: \"kubernetes.io/projected/17cd493c-8f5c-4567-8959-cf6ae0011e51-kube-api-access-wblbr\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:33 crc kubenswrapper[4852]: I1210 12:07:33.974647 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkzb\" (UniqueName: \"kubernetes.io/projected/9c84ab71-bcb9-4237-a827-4fe3c1c2c754-kube-api-access-knkzb\") pod \"frr-k8s-webhook-server-7fcb986d4-9cr6b\" (UID: \"9c84ab71-bcb9-4237-a827-4fe3c1c2c754\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041464 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/357f1ff0-29a8-4905-bac8-9bc8a5c03199-metallb-excludel2\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041521 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-cert\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041576 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-metrics-certs\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041602 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdv8c\" (UniqueName: \"kubernetes.io/projected/357f1ff0-29a8-4905-bac8-9bc8a5c03199-kube-api-access-rdv8c\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041652 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwr5\" (UniqueName: \"kubernetes.io/projected/2335158c-cbc5-45a0-9438-a879aede67f1-kube-api-access-6pwr5\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041708 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-metrics-certs\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.041731 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: E1210 12:07:34.041882 4852 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 10 12:07:34 crc kubenswrapper[4852]: E1210 12:07:34.041939 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist podName:357f1ff0-29a8-4905-bac8-9bc8a5c03199 nodeName:}" failed. No retries permitted until 2025-12-10 12:07:34.541920747 +0000 UTC m=+940.627445971 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist") pod "speaker-jfz84" (UID: "357f1ff0-29a8-4905-bac8-9bc8a5c03199") : secret "metallb-memberlist" not found Dec 10 12:07:34 crc kubenswrapper[4852]: E1210 12:07:34.042171 4852 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.042201 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/357f1ff0-29a8-4905-bac8-9bc8a5c03199-metallb-excludel2\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: E1210 12:07:34.042223 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-metrics-certs podName:2335158c-cbc5-45a0-9438-a879aede67f1 nodeName:}" failed. No retries permitted until 2025-12-10 12:07:34.542205804 +0000 UTC m=+940.627731048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-metrics-certs") pod "controller-f8648f98b-gj6ss" (UID: "2335158c-cbc5-45a0-9438-a879aede67f1") : secret "controller-certs-secret" not found Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.046590 4852 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.046776 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-metrics-certs\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.048734 4852 util.go:30] "No sandbox for pod can be found. 
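The E-level pairs above (secret.go:188 plus nestedpendingoperations.go:348) are an expected ordering race rather than a fault: the speaker, controller, and frr-k8s pods were created before the operator had generated metallb-memberlist and the certificate secrets, so each MountVolume.SetUp fails and is retried with exponential backoff (durationBeforeRetry 500ms here, doubling to 1s a few records below, until the secret appears). A hand-rolled sketch of that retry shape; the kubelet's real logic lives in nestedpendingoperations:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "metallb-memberlist" not found`)

// getSecret stands in for the kubelet's secret lookup; here the secret
// "appears" after one second, as the operator eventually creates it.
func getSecret(start time.Time) error {
	if time.Since(start) < time.Second {
		return errNotFound
	}
	return nil
}

func main() {
	start := time.Now()
	backoff := 500 * time.Millisecond // initial durationBeforeRetry
	for {
		if err := getSecret(start); err != nil {
			fmt.Printf("MountVolume.SetUp failed: %v (retry in %v)\n", err, backoff)
			time.Sleep(backoff)
			backoff *= 2 // 500ms -> 1s -> ... (capped in the real code)
			continue
		}
		fmt.Println(`MountVolume.SetUp succeeded for volume "memberlist"`)
		return
	}
}
```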
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.059784 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-cert\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.068115 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwr5\" (UniqueName: \"kubernetes.io/projected/2335158c-cbc5-45a0-9438-a879aede67f1-kube-api-access-6pwr5\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.075902 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdv8c\" (UniqueName: \"kubernetes.io/projected/357f1ff0-29a8-4905-bac8-9bc8a5c03199-kube-api-access-rdv8c\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.182112 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpfh6"] Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.182654 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gpfh6" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="registry-server" containerID="cri-o://9741e69f5a3dd9e96d2405ac26392f9d0b649357e34db305816c3cb3d5b8a449" gracePeriod=2 Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.318097 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b"] Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.446556 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics-certs\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.451398 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17cd493c-8f5c-4567-8959-cf6ae0011e51-metrics-certs\") pod \"frr-k8s-pddhf\" (UID: \"17cd493c-8f5c-4567-8959-cf6ae0011e51\") " pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.547274 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-metrics-certs\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.547319 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:34 crc kubenswrapper[4852]: E1210 12:07:34.547467 4852 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found 
Dec 10 12:07:34 crc kubenswrapper[4852]: E1210 12:07:34.547526 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist podName:357f1ff0-29a8-4905-bac8-9bc8a5c03199 nodeName:}" failed. No retries permitted until 2025-12-10 12:07:35.547509008 +0000 UTC m=+941.633034232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist") pod "speaker-jfz84" (UID: "357f1ff0-29a8-4905-bac8-9bc8a5c03199") : secret "metallb-memberlist" not found Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.550933 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2335158c-cbc5-45a0-9438-a879aede67f1-metrics-certs\") pod \"controller-f8648f98b-gj6ss\" (UID: \"2335158c-cbc5-45a0-9438-a879aede67f1\") " pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.643363 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pddhf" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.732698 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.827847 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"24e7adc1953ef2d849f21d773a98154b73606785e611088e4bd5e55a7096a48a"} Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.828922 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" event={"ID":"9c84ab71-bcb9-4237-a827-4fe3c1c2c754","Type":"ContainerStarted","Data":"7e64610775a61910fb1b720e3d4cb0ab9d2551b2239f6006d83636ded45ed907"} Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.831023 4852 generic.go:334] "Generic (PLEG): container finished" podID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerID="9741e69f5a3dd9e96d2405ac26392f9d0b649357e34db305816c3cb3d5b8a449" exitCode=0 Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.831061 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpfh6" event={"ID":"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263","Type":"ContainerDied","Data":"9741e69f5a3dd9e96d2405ac26392f9d0b649357e34db305816c3cb3d5b8a449"} Dec 10 12:07:34 crc kubenswrapper[4852]: I1210 12:07:34.926207 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-gj6ss"] Dec 10 12:07:34 crc kubenswrapper[4852]: W1210 12:07:34.929437 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2335158c_cbc5_45a0_9438_a879aede67f1.slice/crio-8ab5506d619bbe9e428c6fcb1fa7a6d0e7825a54efc453f44b55ed25d1c09be0 WatchSource:0}: Error finding container 8ab5506d619bbe9e428c6fcb1fa7a6d0e7825a54efc453f44b55ed25d1c09be0: Status 404 returned error can't find the container with id 8ab5506d619bbe9e428c6fcb1fa7a6d0e7825a54efc453f44b55ed25d1c09be0 Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.104801 4852 util.go:48] "No ready sandbox for pod can be found. 
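The "SyncLoop (PLEG)" and generic.go:334 lines are the Pod Lifecycle Event Generator at work: it periodically relists containers through the CRI and turns state differences into ContainerStarted/ContainerDied events that the sync loop consumes. A minimal diff sketch with illustrative types:

```go
package main

import "fmt"

type state int

const (
	running state = iota
	exited
)

// relist compares the previous and current container states and emits
// PLEG-style events, as in the generic.go:334 lines above.
func relist(prev, cur map[string]state) []string {
	var events []string
	for id, s := range cur {
		old, seen := prev[id]
		switch {
		case !seen && s == running:
			events = append(events, "ContainerStarted "+id)
		case seen && old == running && s == exited:
			events = append(events, "ContainerDied "+id)
		}
	}
	return events
}

func main() {
	prev := map[string]state{"9741e69f": running}
	cur := map[string]state{"9741e69f": exited, "c43b4dea": running}
	for _, e := range relist(prev, cur) {
		fmt.Println("SyncLoop (PLEG): event for pod:", e)
	}
}
```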
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.256039 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jkh2\" (UniqueName: \"kubernetes.io/projected/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-kube-api-access-6jkh2\") pod \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.256393 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-utilities\") pod \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.256560 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-catalog-content\") pod \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\" (UID: \"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263\") " Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.257422 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-utilities" (OuterVolumeSpecName: "utilities") pod "974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" (UID: "974b7b03-d8b7-4d6b-8cfe-ee9113fbf263"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.267945 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-kube-api-access-6jkh2" (OuterVolumeSpecName: "kube-api-access-6jkh2") pod "974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" (UID: "974b7b03-d8b7-4d6b-8cfe-ee9113fbf263"). InnerVolumeSpecName "kube-api-access-6jkh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.274952 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" (UID: "974b7b03-d8b7-4d6b-8cfe-ee9113fbf263"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.357752 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.357792 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jkh2\" (UniqueName: \"kubernetes.io/projected/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-kube-api-access-6jkh2\") on node \"crc\" DevicePath \"\"" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.357804 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.560292 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.564933 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/357f1ff0-29a8-4905-bac8-9bc8a5c03199-memberlist\") pod \"speaker-jfz84\" (UID: \"357f1ff0-29a8-4905-bac8-9bc8a5c03199\") " pod="metallb-system/speaker-jfz84" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.619489 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jfz84" Dec 10 12:07:35 crc kubenswrapper[4852]: W1210 12:07:35.637547 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357f1ff0_29a8_4905_bac8_9bc8a5c03199.slice/crio-c43b4dea074b0af807fc44c3f444366a68dca7b7104a8bbaef982fff5ec0e834 WatchSource:0}: Error finding container c43b4dea074b0af807fc44c3f444366a68dca7b7104a8bbaef982fff5ec0e834: Status 404 returned error can't find the container with id c43b4dea074b0af807fc44c3f444366a68dca7b7104a8bbaef982fff5ec0e834 Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.793163 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kz6bs"] Dec 10 12:07:35 crc kubenswrapper[4852]: E1210 12:07:35.793429 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="extract-content" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.793445 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="extract-content" Dec 10 12:07:35 crc kubenswrapper[4852]: E1210 12:07:35.793455 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="registry-server" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.793462 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="registry-server" Dec 10 12:07:35 crc kubenswrapper[4852]: E1210 12:07:35.793473 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="extract-utilities" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.793479 4852 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="extract-utilities" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.793584 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" containerName="registry-server" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.794332 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.823150 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz6bs"] Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.850083 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpfh6" event={"ID":"974b7b03-d8b7-4d6b-8cfe-ee9113fbf263","Type":"ContainerDied","Data":"f7a69c249980898a55b261efee181cad8df8f1a9ab19ca02b2cf5badc24e182f"} Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.850135 4852 scope.go:117] "RemoveContainer" containerID="9741e69f5a3dd9e96d2405ac26392f9d0b649357e34db305816c3cb3d5b8a449" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.850271 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpfh6" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.857845 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jfz84" event={"ID":"357f1ff0-29a8-4905-bac8-9bc8a5c03199","Type":"ContainerStarted","Data":"c43b4dea074b0af807fc44c3f444366a68dca7b7104a8bbaef982fff5ec0e834"} Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.875065 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gj6ss" event={"ID":"2335158c-cbc5-45a0-9438-a879aede67f1","Type":"ContainerStarted","Data":"a100611aa33124a4a6f304c53a9065d05f8a00658fd3bb135d49d2290b29396e"} Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.875108 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gj6ss" event={"ID":"2335158c-cbc5-45a0-9438-a879aede67f1","Type":"ContainerStarted","Data":"36f9019f22cd3cef85779dc305ad4cdeebb47fa29ff40162450126b607b3b3b2"} Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.875119 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-gj6ss" event={"ID":"2335158c-cbc5-45a0-9438-a879aede67f1","Type":"ContainerStarted","Data":"8ab5506d619bbe9e428c6fcb1fa7a6d0e7825a54efc453f44b55ed25d1c09be0"} Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.884263 4852 scope.go:117] "RemoveContainer" containerID="0e26c1253124b6a68da975ce4c767901fefdb7d54b95e7f94c264f220b034b9d" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.920862 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpfh6"] Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.927790 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpfh6"] Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.932271 4852 scope.go:117] "RemoveContainer" containerID="df66e53b37bd201e44687edbc8dc902da20fe87cc38f18da4c1e0ba696cdf8ff" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.965322 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzklz\" (UniqueName: 
\"kubernetes.io/projected/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-kube-api-access-lzklz\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.965371 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-utilities\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:35 crc kubenswrapper[4852]: I1210 12:07:35.965611 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-catalog-content\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.066418 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzklz\" (UniqueName: \"kubernetes.io/projected/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-kube-api-access-lzklz\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.066475 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-utilities\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.066545 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-catalog-content\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.067250 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-catalog-content\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.067272 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-utilities\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: E1210 12:07:36.068814 4852 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974b7b03_d8b7_4d6b_8cfe_ee9113fbf263.slice\": RecentStats: unable to find data in memory cache]" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.084400 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzklz\" (UniqueName: 
\"kubernetes.io/projected/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-kube-api-access-lzklz\") pod \"community-operators-kz6bs\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") " pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.182176 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974b7b03-d8b7-4d6b-8cfe-ee9113fbf263" path="/var/lib/kubelet/pods/974b7b03-d8b7-4d6b-8cfe-ee9113fbf263/volumes" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.234286 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz6bs" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.603638 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.604604 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.693013 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.781336 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz6bs"] Dec 10 12:07:36 crc kubenswrapper[4852]: W1210 12:07:36.789905 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2230b6_b922_4cba_b7c9_d00ae0cb1bc3.slice/crio-9af0237cde7d5a7d460fc4042ff947f2684be9912382625dfc62fed36649184e WatchSource:0}: Error finding container 9af0237cde7d5a7d460fc4042ff947f2684be9912382625dfc62fed36649184e: Status 404 returned error can't find the container with id 9af0237cde7d5a7d460fc4042ff947f2684be9912382625dfc62fed36649184e Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.903777 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jfz84" event={"ID":"357f1ff0-29a8-4905-bac8-9bc8a5c03199","Type":"ContainerStarted","Data":"82b01b555dfd617cba85526d11b3e2800537bb3a5ffadda60e92c875431882ed"} Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.905871 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerStarted","Data":"9af0237cde7d5a7d460fc4042ff947f2684be9912382625dfc62fed36649184e"} Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.906372 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-gj6ss" Dec 10 12:07:36 crc kubenswrapper[4852]: I1210 12:07:36.984897 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:37 crc kubenswrapper[4852]: I1210 12:07:37.015128 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-gj6ss" podStartSLOduration=4.015102612 podStartE2EDuration="4.015102612s" podCreationTimestamp="2025-12-10 12:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:07:36.927091754 +0000 UTC m=+943.012616978" watchObservedRunningTime="2025-12-10 12:07:37.015102612 +0000 UTC m=+943.100627846" Dec 10 12:07:37 crc 
kubenswrapper[4852]: I1210 12:07:37.914130 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jfz84" event={"ID":"357f1ff0-29a8-4905-bac8-9bc8a5c03199","Type":"ContainerStarted","Data":"1c52ff7d5128d9d29ff822358945f7766d4dd1c2c5cda2fd98bf44eae2c86833"} Dec 10 12:07:37 crc kubenswrapper[4852]: I1210 12:07:37.914288 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jfz84" Dec 10 12:07:37 crc kubenswrapper[4852]: I1210 12:07:37.916332 4852 generic.go:334] "Generic (PLEG): container finished" podID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerID="1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032" exitCode=0 Dec 10 12:07:37 crc kubenswrapper[4852]: I1210 12:07:37.916394 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerDied","Data":"1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032"} Dec 10 12:07:37 crc kubenswrapper[4852]: I1210 12:07:37.931054 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jfz84" podStartSLOduration=4.931037902 podStartE2EDuration="4.931037902s" podCreationTimestamp="2025-12-10 12:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:07:37.929900474 +0000 UTC m=+944.015425708" watchObservedRunningTime="2025-12-10 12:07:37.931037902 +0000 UTC m=+944.016563146" Dec 10 12:07:39 crc kubenswrapper[4852]: I1210 12:07:39.938886 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerStarted","Data":"ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa"} Dec 10 12:07:40 crc kubenswrapper[4852]: I1210 12:07:40.771602 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4cg2"] Dec 10 12:07:40 crc kubenswrapper[4852]: I1210 12:07:40.771838 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s4cg2" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="registry-server" containerID="cri-o://c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13" gracePeriod=2 Dec 10 12:07:40 crc kubenswrapper[4852]: I1210 12:07:40.949316 4852 generic.go:334] "Generic (PLEG): container finished" podID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerID="ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa" exitCode=0 Dec 10 12:07:40 crc kubenswrapper[4852]: I1210 12:07:40.949391 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerDied","Data":"ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa"} Dec 10 12:07:41 crc kubenswrapper[4852]: I1210 12:07:41.956466 4852 generic.go:334] "Generic (PLEG): container finished" podID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerID="c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13" exitCode=0 Dec 10 12:07:41 crc kubenswrapper[4852]: I1210 12:07:41.956509 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cg2" 
event={"ID":"acff76af-d815-49a0-bb14-c3b2c3ff2287","Type":"ContainerDied","Data":"c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13"} Dec 10 12:07:46 crc kubenswrapper[4852]: E1210 12:07:46.601930 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13 is running failed: container process not found" containerID="c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 12:07:46 crc kubenswrapper[4852]: E1210 12:07:46.603969 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13 is running failed: container process not found" containerID="c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 12:07:46 crc kubenswrapper[4852]: E1210 12:07:46.604713 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13 is running failed: container process not found" containerID="c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13" cmd=["grpc_health_probe","-addr=:50051"] Dec 10 12:07:46 crc kubenswrapper[4852]: E1210 12:07:46.604749 4852 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-s4cg2" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="registry-server" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.430099 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.478312 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shsf8\" (UniqueName: \"kubernetes.io/projected/acff76af-d815-49a0-bb14-c3b2c3ff2287-kube-api-access-shsf8\") pod \"acff76af-d815-49a0-bb14-c3b2c3ff2287\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.478482 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-catalog-content\") pod \"acff76af-d815-49a0-bb14-c3b2c3ff2287\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.478545 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-utilities\") pod \"acff76af-d815-49a0-bb14-c3b2c3ff2287\" (UID: \"acff76af-d815-49a0-bb14-c3b2c3ff2287\") " Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.479551 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-utilities" (OuterVolumeSpecName: "utilities") pod "acff76af-d815-49a0-bb14-c3b2c3ff2287" (UID: "acff76af-d815-49a0-bb14-c3b2c3ff2287"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.484410 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acff76af-d815-49a0-bb14-c3b2c3ff2287-kube-api-access-shsf8" (OuterVolumeSpecName: "kube-api-access-shsf8") pod "acff76af-d815-49a0-bb14-c3b2c3ff2287" (UID: "acff76af-d815-49a0-bb14-c3b2c3ff2287"). InnerVolumeSpecName "kube-api-access-shsf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.524258 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acff76af-d815-49a0-bb14-c3b2c3ff2287" (UID: "acff76af-d815-49a0-bb14-c3b2c3ff2287"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.580325 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.580396 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shsf8\" (UniqueName: \"kubernetes.io/projected/acff76af-d815-49a0-bb14-c3b2c3ff2287-kube-api-access-shsf8\") on node \"crc\" DevicePath \"\"" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.580418 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acff76af-d815-49a0-bb14-c3b2c3ff2287-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.996274 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cg2" event={"ID":"acff76af-d815-49a0-bb14-c3b2c3ff2287","Type":"ContainerDied","Data":"159d4cbe0f45c58d29ce031da8683416f71908302d2daf153b45de96f4f6996e"} Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.996331 4852 scope.go:117] "RemoveContainer" containerID="c08c348f0a12ca479eb8dd93b4dd396df45c66ad955871f074056e5bf9228f13" Dec 10 12:07:47 crc kubenswrapper[4852]: I1210 12:07:47.996364 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4cg2" Dec 10 12:07:48 crc kubenswrapper[4852]: I1210 12:07:48.025460 4852 scope.go:117] "RemoveContainer" containerID="c756d0e1214d9b3aaf6d8b9afdc5e660569210107346f3a75ab797dc523705fe" Dec 10 12:07:48 crc kubenswrapper[4852]: I1210 12:07:48.029459 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4cg2"] Dec 10 12:07:48 crc kubenswrapper[4852]: I1210 12:07:48.037858 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s4cg2"] Dec 10 12:07:48 crc kubenswrapper[4852]: I1210 12:07:48.044597 4852 scope.go:117] "RemoveContainer" containerID="844cce677af90fe468e06894725548c77412892ae2e0640cfa4c06174dfa939e" Dec 10 12:07:48 crc kubenswrapper[4852]: I1210 12:07:48.180294 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" path="/var/lib/kubelet/pods/acff76af-d815-49a0-bb14-c3b2c3ff2287/volumes" Dec 10 12:07:51 crc kubenswrapper[4852]: E1210 12:07:51.560105 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/frr-rhel9@sha256:e5c5e7ca4ed54c9edba5dfa1d504bbe58016c2abdc872ebb8b26a628958e5a2a" Dec 10 12:07:51 crc kubenswrapper[4852]: E1210 12:07:51.560800 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:cp-frr-files,Image:registry.redhat.io/openshift4/frr-rhel9@sha256:e5c5e7ca4ed54c9edba5dfa1d504bbe58016c2abdc872ebb8b26a628958e5a2a,Command:[/bin/sh -c cp -rLf /tmp/frr/* 
/etc/frr/],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:frr-startup,ReadOnly:false,MountPath:/tmp/frr,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:frr-conf,ReadOnly:false,MountPath:/etc/frr,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wblbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*100,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*101,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod frr-k8s-pddhf_metallb-system(17cd493c-8f5c-4567-8959-cf6ae0011e51): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 10 12:07:51 crc kubenswrapper[4852]: E1210 12:07:51.562973 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cp-frr-files\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="metallb-system/frr-k8s-pddhf" podUID="17cd493c-8f5c-4567-8959-cf6ae0011e51" Dec 10 12:07:52 crc kubenswrapper[4852]: E1210 12:07:52.830024 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cp-frr-files\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/frr-rhel9@sha256:e5c5e7ca4ed54c9edba5dfa1d504bbe58016c2abdc872ebb8b26a628958e5a2a\\\"\"" pod="metallb-system/frr-k8s-pddhf" podUID="17cd493c-8f5c-4567-8959-cf6ae0011e51" Dec 10 12:07:54 crc kubenswrapper[4852]: I1210 12:07:54.049808 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerStarted","Data":"21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438"} Dec 10 12:07:54 crc kubenswrapper[4852]: I1210 12:07:54.052020 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" event={"ID":"9c84ab71-bcb9-4237-a827-4fe3c1c2c754","Type":"ContainerStarted","Data":"02bcf2a96ac7dde991cf7ce4a0883a197a534b9fcf4fcacafa2ca3b25a344bda"} Dec 10 12:07:54 crc kubenswrapper[4852]: I1210 12:07:54.052418 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" Dec 10 12:07:54 crc kubenswrapper[4852]: I1210 12:07:54.069709 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kz6bs" podStartSLOduration=3.506920182 podStartE2EDuration="19.069693544s" podCreationTimestamp="2025-12-10 12:07:35 +0000 UTC" firstStartedPulling="2025-12-10 12:07:37.918652659 +0000 UTC m=+944.004177883" lastFinishedPulling="2025-12-10 
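The cp-frr-files init container above fails with ErrImagePull (the registry copy was canceled mid-transfer) and, on the very next sync, is parked in ImagePullBackOff: the kubelet does not retry the pull in a tight loop but on an exponential backoff. A rough sketch of that retry cadence, assuming the commonly documented kubelet defaults of a 10s initial delay doubling up to a 5m cap (the actual values are kubelet configuration and are not visible in this log):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed defaults, not read from this log: initial 10s, factor 2, max 5m.
	delay := 10 * time.Second
	const maxDelay = 5 * time.Minute

	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("pull attempt %d failed; next retry in %s (ImagePullBackOff)\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

The backoff resets once a pull succeeds, which is consistent with frr-k8s-pddhf's init containers completing normally at 12:08:08 below.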
Dec 10 12:07:54 crc kubenswrapper[4852]: I1210 12:07:54.084756 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b" podStartSLOduration=1.915232007 podStartE2EDuration="21.084737115s" podCreationTimestamp="2025-12-10 12:07:33 +0000 UTC" firstStartedPulling="2025-12-10 12:07:34.333045098 +0000 UTC m=+940.418570322" lastFinishedPulling="2025-12-10 12:07:53.502550206 +0000 UTC m=+959.588075430" observedRunningTime="2025-12-10 12:07:54.083915084 +0000 UTC m=+960.169440308" watchObservedRunningTime="2025-12-10 12:07:54.084737115 +0000 UTC m=+960.170262339"
Dec 10 12:07:54 crc kubenswrapper[4852]: I1210 12:07:54.737071 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-gj6ss"
Dec 10 12:07:55 crc kubenswrapper[4852]: I1210 12:07:55.627532 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jfz84"
Dec 10 12:07:56 crc kubenswrapper[4852]: I1210 12:07:56.234812 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kz6bs"
Dec 10 12:07:56 crc kubenswrapper[4852]: I1210 12:07:56.234862 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kz6bs"
Dec 10 12:07:56 crc kubenswrapper[4852]: I1210 12:07:56.281064 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kz6bs"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.985487 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kv2w5"]
Dec 10 12:08:01 crc kubenswrapper[4852]: E1210 12:08:01.986278 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="registry-server"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.986293 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="registry-server"
Dec 10 12:08:01 crc kubenswrapper[4852]: E1210 12:08:01.986308 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="extract-utilities"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.986314 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="extract-utilities"
Dec 10 12:08:01 crc kubenswrapper[4852]: E1210 12:08:01.986324 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="extract-content"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.986330 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="extract-content"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.986443 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="acff76af-d815-49a0-bb14-c3b2c3ff2287" containerName="registry-server"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.986864 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.989714 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.989736 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 10 12:08:01 crc kubenswrapper[4852]: I1210 12:08:01.989930 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fddqv"
Dec 10 12:08:02 crc kubenswrapper[4852]: I1210 12:08:01.998016 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kv2w5"]
Dec 10 12:08:02 crc kubenswrapper[4852]: I1210 12:08:02.177801 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xnh\" (UniqueName: \"kubernetes.io/projected/75639596-a335-4cb1-a5b9-1cab464d990a-kube-api-access-t2xnh\") pod \"openstack-operator-index-kv2w5\" (UID: \"75639596-a335-4cb1-a5b9-1cab464d990a\") " pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:02 crc kubenswrapper[4852]: I1210 12:08:02.279073 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xnh\" (UniqueName: \"kubernetes.io/projected/75639596-a335-4cb1-a5b9-1cab464d990a-kube-api-access-t2xnh\") pod \"openstack-operator-index-kv2w5\" (UID: \"75639596-a335-4cb1-a5b9-1cab464d990a\") " pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:02 crc kubenswrapper[4852]: I1210 12:08:02.308604 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xnh\" (UniqueName: \"kubernetes.io/projected/75639596-a335-4cb1-a5b9-1cab464d990a-kube-api-access-t2xnh\") pod \"openstack-operator-index-kv2w5\" (UID: \"75639596-a335-4cb1-a5b9-1cab464d990a\") " pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:02 crc kubenswrapper[4852]: I1210 12:08:02.351296 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:02 crc kubenswrapper[4852]: I1210 12:08:02.780403 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kv2w5"]
Dec 10 12:08:03 crc kubenswrapper[4852]: I1210 12:08:03.123917 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kv2w5" event={"ID":"75639596-a335-4cb1-a5b9-1cab464d990a","Type":"ContainerStarted","Data":"df2be452f20fd7883351dc3d83f3a025263d9bbc9430e82b8d5574dc2b3186af"}
Dec 10 12:08:04 crc kubenswrapper[4852]: I1210 12:08:04.055184 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-9cr6b"
Dec 10 12:08:06 crc kubenswrapper[4852]: I1210 12:08:06.281391 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kz6bs"
Dec 10 12:08:07 crc kubenswrapper[4852]: I1210 12:08:07.571517 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kv2w5"]
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.171634 4852 generic.go:334] "Generic (PLEG): container finished" podID="17cd493c-8f5c-4567-8959-cf6ae0011e51" containerID="30482aaeab1eea82372c2090e32015b8741e6d386e1795404b4723ed5aa5e408" exitCode=0
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.179996 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerDied","Data":"30482aaeab1eea82372c2090e32015b8741e6d386e1795404b4723ed5aa5e408"}
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.180042 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kv2w5" event={"ID":"75639596-a335-4cb1-a5b9-1cab464d990a","Type":"ContainerStarted","Data":"dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39"}
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.193008 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kv2w5" podStartSLOduration=2.631632994 podStartE2EDuration="7.192988772s" podCreationTimestamp="2025-12-10 12:08:01 +0000 UTC" firstStartedPulling="2025-12-10 12:08:02.792140401 +0000 UTC m=+968.877665625" lastFinishedPulling="2025-12-10 12:08:07.353496179 +0000 UTC m=+973.439021403" observedRunningTime="2025-12-10 12:08:08.189297261 +0000 UTC m=+974.274822485" watchObservedRunningTime="2025-12-10 12:08:08.192988772 +0000 UTC m=+974.278513996"
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.383226 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nhqss"]
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.384761 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.395873 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nhqss"]
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.490896 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr7rb\" (UniqueName: \"kubernetes.io/projected/62488b9a-bd45-4d7e-a890-f2d585698d58-kube-api-access-qr7rb\") pod \"openstack-operator-index-nhqss\" (UID: \"62488b9a-bd45-4d7e-a890-f2d585698d58\") " pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.591842 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr7rb\" (UniqueName: \"kubernetes.io/projected/62488b9a-bd45-4d7e-a890-f2d585698d58-kube-api-access-qr7rb\") pod \"openstack-operator-index-nhqss\" (UID: \"62488b9a-bd45-4d7e-a890-f2d585698d58\") " pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.613045 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr7rb\" (UniqueName: \"kubernetes.io/projected/62488b9a-bd45-4d7e-a890-f2d585698d58-kube-api-access-qr7rb\") pod \"openstack-operator-index-nhqss\" (UID: \"62488b9a-bd45-4d7e-a890-f2d585698d58\") " pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:08 crc kubenswrapper[4852]: I1210 12:08:08.757591 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.183048 4852 generic.go:334] "Generic (PLEG): container finished" podID="17cd493c-8f5c-4567-8959-cf6ae0011e51" containerID="bb675e4bd8581f70fb20583dc78b674a49a72f00723c20e37677dabd864ba9b5" exitCode=0
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.183106 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerDied","Data":"bb675e4bd8581f70fb20583dc78b674a49a72f00723c20e37677dabd864ba9b5"}
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.183544 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-kv2w5" podUID="75639596-a335-4cb1-a5b9-1cab464d990a" containerName="registry-server" containerID="cri-o://dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39" gracePeriod=2
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.242701 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nhqss"]
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.568841 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.608333 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2xnh\" (UniqueName: \"kubernetes.io/projected/75639596-a335-4cb1-a5b9-1cab464d990a-kube-api-access-t2xnh\") pod \"75639596-a335-4cb1-a5b9-1cab464d990a\" (UID: \"75639596-a335-4cb1-a5b9-1cab464d990a\") "
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.615403 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75639596-a335-4cb1-a5b9-1cab464d990a-kube-api-access-t2xnh" (OuterVolumeSpecName: "kube-api-access-t2xnh") pod "75639596-a335-4cb1-a5b9-1cab464d990a" (UID: "75639596-a335-4cb1-a5b9-1cab464d990a"). InnerVolumeSpecName "kube-api-access-t2xnh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:08:09 crc kubenswrapper[4852]: I1210 12:08:09.710475 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2xnh\" (UniqueName: \"kubernetes.io/projected/75639596-a335-4cb1-a5b9-1cab464d990a-kube-api-access-t2xnh\") on node \"crc\" DevicePath \"\""
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.192212 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nhqss" event={"ID":"62488b9a-bd45-4d7e-a890-f2d585698d58","Type":"ContainerStarted","Data":"916f055cf9637a7f2e28a0adf8e8630a1bf057228c0dd6cd3a288845f34cf311"}
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.192289 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nhqss" event={"ID":"62488b9a-bd45-4d7e-a890-f2d585698d58","Type":"ContainerStarted","Data":"40c262462e6cdd686d91a30a2502d0a186565b5bbe7c151ff8c14d9d8d58e1a7"}
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.195293 4852 generic.go:334] "Generic (PLEG): container finished" podID="75639596-a335-4cb1-a5b9-1cab464d990a" containerID="dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39" exitCode=0
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.195345 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kv2w5"
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.195413 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kv2w5" event={"ID":"75639596-a335-4cb1-a5b9-1cab464d990a","Type":"ContainerDied","Data":"dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39"}
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.195500 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kv2w5" event={"ID":"75639596-a335-4cb1-a5b9-1cab464d990a","Type":"ContainerDied","Data":"df2be452f20fd7883351dc3d83f3a025263d9bbc9430e82b8d5574dc2b3186af"}
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.195532 4852 scope.go:117] "RemoveContainer" containerID="dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39"
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.199605 4852 generic.go:334] "Generic (PLEG): container finished" podID="17cd493c-8f5c-4567-8959-cf6ae0011e51" containerID="9b913a31ca13b5b101a02932ab60e2ba6a8ede26196885fb0e8fa0ae80c86b6d" exitCode=0
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.199655 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerDied","Data":"9b913a31ca13b5b101a02932ab60e2ba6a8ede26196885fb0e8fa0ae80c86b6d"}
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.217515 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nhqss" podStartSLOduration=2.165044147 podStartE2EDuration="2.217480413s" podCreationTimestamp="2025-12-10 12:08:08 +0000 UTC" firstStartedPulling="2025-12-10 12:08:09.306276533 +0000 UTC m=+975.391801757" lastFinishedPulling="2025-12-10 12:08:09.358712799 +0000 UTC m=+975.444238023" observedRunningTime="2025-12-10 12:08:10.212222722 +0000 UTC m=+976.297747946" watchObservedRunningTime="2025-12-10 12:08:10.217480413 +0000 UTC m=+976.303005637"
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.225897 4852 scope.go:117] "RemoveContainer" containerID="dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39"
Dec 10 12:08:10 crc kubenswrapper[4852]: E1210 12:08:10.228477 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39\": container with ID starting with dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39 not found: ID does not exist" containerID="dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39"
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.228528 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39"} err="failed to get container status \"dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39\": rpc error: code = NotFound desc = could not find container \"dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39\": container with ID starting with dbe0f03edb79e31325435fbbbcf6a5400eafe336dd561803e3882b33c6725e39 not found: ID does not exist"
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.231376 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kv2w5"]
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.235927 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kv2w5"]
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.573349 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kz6bs"]
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.574039 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kz6bs" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="registry-server" containerID="cri-o://21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438" gracePeriod=2
Dec 10 12:08:10 crc kubenswrapper[4852]: I1210 12:08:10.977897 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz6bs"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.029777 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzklz\" (UniqueName: \"kubernetes.io/projected/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-kube-api-access-lzklz\") pod \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") "
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.029914 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-catalog-content\") pod \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") "
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.029938 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-utilities\") pod \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\" (UID: \"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3\") "
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.031777 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-utilities" (OuterVolumeSpecName: "utilities") pod "3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" (UID: "3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.053621 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-kube-api-access-lzklz" (OuterVolumeSpecName: "kube-api-access-lzklz") pod "3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" (UID: "3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3"). InnerVolumeSpecName "kube-api-access-lzklz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.120921 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" (UID: "3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.131839 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.131880 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-utilities\") on node \"crc\" DevicePath \"\""
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.131890 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzklz\" (UniqueName: \"kubernetes.io/projected/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3-kube-api-access-lzklz\") on node \"crc\" DevicePath \"\""
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.212646 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"69baa6659c684a7dfcd9611e529f073b414e84dd61ff3bf532314a565a578fa8"}
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.212697 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"f53c85d09184a1d731fff51be48235a6daf3e9e971d6d9f2945d7f0723221ac3"}
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.212709 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"7c390f44c779c4714db2bcd50158b6945f4d83591dd1fb2b4a4a6d854e65a420"}
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.212720 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"0ae2bca1a2941261233b18964257a13f4ad82c9e325d9a6f7621879deb6dd245"}
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.215615 4852 generic.go:334] "Generic (PLEG): container finished" podID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerID="21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438" exitCode=0
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.215680 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz6bs"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.215704 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerDied","Data":"21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438"}
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.215738 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz6bs" event={"ID":"3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3","Type":"ContainerDied","Data":"9af0237cde7d5a7d460fc4042ff947f2684be9912382625dfc62fed36649184e"}
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.215769 4852 scope.go:117] "RemoveContainer" containerID="21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.233435 4852 scope.go:117] "RemoveContainer" containerID="ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.255312 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kz6bs"]
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.258294 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kz6bs"]
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.271811 4852 scope.go:117] "RemoveContainer" containerID="1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.294832 4852 scope.go:117] "RemoveContainer" containerID="21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438"
Dec 10 12:08:11 crc kubenswrapper[4852]: E1210 12:08:11.295527 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438\": container with ID starting with 21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438 not found: ID does not exist" containerID="21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.295559 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438"} err="failed to get container status \"21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438\": rpc error: code = NotFound desc = could not find container \"21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438\": container with ID starting with 21440942a75639022ddb851333762ed365827d5c3e3f00dec6a02ef718750438 not found: ID does not exist"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.295587 4852 scope.go:117] "RemoveContainer" containerID="ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa"
Dec 10 12:08:11 crc kubenswrapper[4852]: E1210 12:08:11.296019 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa\": container with ID starting with ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa not found: ID does not exist" containerID="ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.296069 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa"} err="failed to get container status \"ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa\": rpc error: code = NotFound desc = could not find container \"ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa\": container with ID starting with ce13c5badafda25f456034de94d2c879a625237f9bcef7241d9893ea575dccfa not found: ID does not exist"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.296102 4852 scope.go:117] "RemoveContainer" containerID="1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032"
Dec 10 12:08:11 crc kubenswrapper[4852]: E1210 12:08:11.296474 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032\": container with ID starting with 1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032 not found: ID does not exist" containerID="1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032"
Dec 10 12:08:11 crc kubenswrapper[4852]: I1210 12:08:11.296504 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032"} err="failed to get container status \"1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032\": rpc error: code = NotFound desc = could not find container \"1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032\": container with ID starting with 1e42807a316dbbe90e25dad8166958d1e1ec55c8539ee67617caf288a2c9d032 not found: ID does not exist"
Dec 10 12:08:12 crc kubenswrapper[4852]: I1210 12:08:12.180313 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" path="/var/lib/kubelet/pods/3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3/volumes"
Dec 10 12:08:12 crc kubenswrapper[4852]: I1210 12:08:12.181637 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75639596-a335-4cb1-a5b9-1cab464d990a" path="/var/lib/kubelet/pods/75639596-a335-4cb1-a5b9-1cab464d990a/volumes"
Dec 10 12:08:12 crc kubenswrapper[4852]: I1210 12:08:12.228575 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"6acfd4e3e581d20f9b98bfa6d0347ea89e035c9056192b7f5767d662c685c442"}
Dec 10 12:08:12 crc kubenswrapper[4852]: I1210 12:08:12.228624 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pddhf" event={"ID":"17cd493c-8f5c-4567-8959-cf6ae0011e51","Type":"ContainerStarted","Data":"1115f241bdc10a687f3d2bc78808157509db4770bde467dc84912b853b7e7cb5"}
Dec 10 12:08:12 crc kubenswrapper[4852]: I1210 12:08:12.228745 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pddhf"
Dec 10 12:08:12 crc kubenswrapper[4852]: I1210 12:08:12.256981 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pddhf" podStartSLOduration=-9223371997.597828 podStartE2EDuration="39.256947765s" podCreationTimestamp="2025-12-10 12:07:33 +0000 UTC" firstStartedPulling="2025-12-10 12:07:34.799025995 +0000 UTC m=+940.884551219" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:08:12.255741895 +0000 UTC m=+978.341267149" watchObservedRunningTime="2025-12-10 12:08:12.256947765 +0000 UTC m=+978.342472999"
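The podStartSLOduration=-9223371997.597828 just logged for frr-k8s-pddhf is not a real negative latency: lastFinishedPulling is the zero time (the pod never recorded a completed pull window in this tracker), and subtracting a real 2025 timestamp from Go's zero time.Time saturates time.Duration at math.MinInt64 nanoseconds. Adding the 39.256947765s E2E duration to that saturated value reproduces the logged figure exactly. A sketch of the arithmetic, reconstructed from the values in the entry above (the variable names and the exact formula are an assumption, not kubelet source):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the pod_startup_latency_tracker entry above.
	var lastFinishedPulling time.Time // zero value: "0001-01-01 00:00:00 +0000 UTC"
	firstStartedPulling := time.Date(2025, 12, 10, 12, 7, 34, 799025995, time.UTC)
	e2e := 39256947765 * time.Nanosecond // podStartE2EDuration="39.256947765s"

	// time.Time.Sub clamps to math.MinInt64 ns when the true difference
	// underflows int64, so "zero time minus 2025" saturates at about -292 years.
	pullWindow := lastFinishedPulling.Sub(firstStartedPulling)

	slo := e2e + pullWindow
	fmt.Printf("podStartSLOduration=%.6f\n", slo.Seconds()) // -9223371997.597828
}

In other words, the SLO metric excludes image-pull time, and with no finished pull to subtract the clamped duration leaks straight into the logged number.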
Dec 10 12:08:14 crc kubenswrapper[4852]: I1210 12:08:14.643640 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pddhf"
Dec 10 12:08:14 crc kubenswrapper[4852]: I1210 12:08:14.699997 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pddhf"
Dec 10 12:08:18 crc kubenswrapper[4852]: I1210 12:08:18.757892 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:18 crc kubenswrapper[4852]: I1210 12:08:18.760410 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:18 crc kubenswrapper[4852]: I1210 12:08:18.781864 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:19 crc kubenswrapper[4852]: I1210 12:08:19.313930 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nhqss"
Dec 10 12:08:24 crc kubenswrapper[4852]: I1210 12:08:24.647054 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pddhf"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.234740 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"]
Dec 10 12:08:33 crc kubenswrapper[4852]: E1210 12:08:33.236425 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="extract-utilities"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.236619 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="extract-utilities"
Dec 10 12:08:33 crc kubenswrapper[4852]: E1210 12:08:33.236714 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75639596-a335-4cb1-a5b9-1cab464d990a" containerName="registry-server"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.236786 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="75639596-a335-4cb1-a5b9-1cab464d990a" containerName="registry-server"
Dec 10 12:08:33 crc kubenswrapper[4852]: E1210 12:08:33.236873 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="extract-content"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.236945 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="extract-content"
Dec 10 12:08:33 crc kubenswrapper[4852]: E1210 12:08:33.237036 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="registry-server"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.237120 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="registry-server"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.237401 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2230b6-b922-4cba-b7c9-d00ae0cb1bc3" containerName="registry-server"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.237506 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="75639596-a335-4cb1-a5b9-1cab464d990a" containerName="registry-server"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.240158 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.242662 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6svph"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.251672 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"]
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.260579 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-bundle\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.260720 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jgz4\" (UniqueName: \"kubernetes.io/projected/b2331a1e-5a05-454c-8416-5c475817b166-kube-api-access-8jgz4\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.260814 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-util\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.362198 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-util\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.362433 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-bundle\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.362477 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jgz4\" (UniqueName: \"kubernetes.io/projected/b2331a1e-5a05-454c-8416-5c475817b166-kube-api-access-8jgz4\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.362859 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-util\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.362882 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-bundle\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.380912 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jgz4\" (UniqueName: \"kubernetes.io/projected/b2331a1e-5a05-454c-8416-5c475817b166-kube-api-access-8jgz4\") pod \"bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.559198 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"
Dec 10 12:08:33 crc kubenswrapper[4852]: I1210 12:08:33.980851 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr"]
Dec 10 12:08:34 crc kubenswrapper[4852]: I1210 12:08:34.405382 4852 generic.go:334] "Generic (PLEG): container finished" podID="b2331a1e-5a05-454c-8416-5c475817b166" containerID="a1056e10cb29c98a328dfd3c390611a49caf0798f937dcc1a44ffe13f054131a" exitCode=0
Dec 10 12:08:34 crc kubenswrapper[4852]: I1210 12:08:34.405517 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" event={"ID":"b2331a1e-5a05-454c-8416-5c475817b166","Type":"ContainerDied","Data":"a1056e10cb29c98a328dfd3c390611a49caf0798f937dcc1a44ffe13f054131a"}
Dec 10 12:08:34 crc kubenswrapper[4852]: I1210 12:08:34.405768 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" event={"ID":"b2331a1e-5a05-454c-8416-5c475817b166","Type":"ContainerStarted","Data":"02ed20519d439ebdd8affb2df88561c18aec0caf18efcdc3f8e8cb0e15ada5bd"}
Dec 10 12:08:35 crc kubenswrapper[4852]: I1210 12:08:35.415668 4852 generic.go:334] "Generic (PLEG): container finished" podID="b2331a1e-5a05-454c-8416-5c475817b166" containerID="0274f4c8660c0f05544f0816659135072773b5c8e627d8bee37c52383c1e6f7c" exitCode=0
Dec 10 12:08:35 crc kubenswrapper[4852]: I1210 12:08:35.415797 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" event={"ID":"b2331a1e-5a05-454c-8416-5c475817b166","Type":"ContainerDied","Data":"0274f4c8660c0f05544f0816659135072773b5c8e627d8bee37c52383c1e6f7c"}
Dec 10 12:08:36 crc kubenswrapper[4852]: I1210 12:08:36.427503 4852 generic.go:334] "Generic (PLEG): container finished" podID="b2331a1e-5a05-454c-8416-5c475817b166"
containerID="5194e0f2f2a9506951d40410d5a691ff19b15830ac7a62f47573219c211b1aa0" exitCode=0 Dec 10 12:08:36 crc kubenswrapper[4852]: I1210 12:08:36.427599 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" event={"ID":"b2331a1e-5a05-454c-8416-5c475817b166","Type":"ContainerDied","Data":"5194e0f2f2a9506951d40410d5a691ff19b15830ac7a62f47573219c211b1aa0"} Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.674910 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.826344 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jgz4\" (UniqueName: \"kubernetes.io/projected/b2331a1e-5a05-454c-8416-5c475817b166-kube-api-access-8jgz4\") pod \"b2331a1e-5a05-454c-8416-5c475817b166\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.826421 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-bundle\") pod \"b2331a1e-5a05-454c-8416-5c475817b166\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.826481 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-util\") pod \"b2331a1e-5a05-454c-8416-5c475817b166\" (UID: \"b2331a1e-5a05-454c-8416-5c475817b166\") " Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.827690 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-bundle" (OuterVolumeSpecName: "bundle") pod "b2331a1e-5a05-454c-8416-5c475817b166" (UID: "b2331a1e-5a05-454c-8416-5c475817b166"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.834584 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2331a1e-5a05-454c-8416-5c475817b166-kube-api-access-8jgz4" (OuterVolumeSpecName: "kube-api-access-8jgz4") pod "b2331a1e-5a05-454c-8416-5c475817b166" (UID: "b2331a1e-5a05-454c-8416-5c475817b166"). InnerVolumeSpecName "kube-api-access-8jgz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.841538 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-util" (OuterVolumeSpecName: "util") pod "b2331a1e-5a05-454c-8416-5c475817b166" (UID: "b2331a1e-5a05-454c-8416-5c475817b166"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.927518 4852 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-util\") on node \"crc\" DevicePath \"\"" Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.927560 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jgz4\" (UniqueName: \"kubernetes.io/projected/b2331a1e-5a05-454c-8416-5c475817b166-kube-api-access-8jgz4\") on node \"crc\" DevicePath \"\"" Dec 10 12:08:37 crc kubenswrapper[4852]: I1210 12:08:37.927570 4852 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2331a1e-5a05-454c-8416-5c475817b166-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:08:38 crc kubenswrapper[4852]: I1210 12:08:38.443594 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" event={"ID":"b2331a1e-5a05-454c-8416-5c475817b166","Type":"ContainerDied","Data":"02ed20519d439ebdd8affb2df88561c18aec0caf18efcdc3f8e8cb0e15ada5bd"} Dec 10 12:08:38 crc kubenswrapper[4852]: I1210 12:08:38.443648 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ed20519d439ebdd8affb2df88561c18aec0caf18efcdc3f8e8cb0e15ada5bd" Dec 10 12:08:38 crc kubenswrapper[4852]: I1210 12:08:38.443650 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.001116 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf"] Dec 10 12:08:41 crc kubenswrapper[4852]: E1210 12:08:41.001745 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="pull" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.001762 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="pull" Dec 10 12:08:41 crc kubenswrapper[4852]: E1210 12:08:41.001777 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="extract" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.001785 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="extract" Dec 10 12:08:41 crc kubenswrapper[4852]: E1210 12:08:41.001803 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="util" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.001811 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="util" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.001971 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2331a1e-5a05-454c-8416-5c475817b166" containerName="extract" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.002505 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.006302 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mq6mk" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.045048 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf"] Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.170937 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96rp\" (UniqueName: \"kubernetes.io/projected/267779dd-45a9-4ee6-985d-39fb7d7cb207-kube-api-access-f96rp\") pod \"openstack-operator-controller-operator-5bc74577c9-ch9wf\" (UID: \"267779dd-45a9-4ee6-985d-39fb7d7cb207\") " pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.272182 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96rp\" (UniqueName: \"kubernetes.io/projected/267779dd-45a9-4ee6-985d-39fb7d7cb207-kube-api-access-f96rp\") pod \"openstack-operator-controller-operator-5bc74577c9-ch9wf\" (UID: \"267779dd-45a9-4ee6-985d-39fb7d7cb207\") " pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.294107 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96rp\" (UniqueName: \"kubernetes.io/projected/267779dd-45a9-4ee6-985d-39fb7d7cb207-kube-api-access-f96rp\") pod \"openstack-operator-controller-operator-5bc74577c9-ch9wf\" (UID: \"267779dd-45a9-4ee6-985d-39fb7d7cb207\") " pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.320129 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:08:41 crc kubenswrapper[4852]: I1210 12:08:41.776931 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf"] Dec 10 12:08:42 crc kubenswrapper[4852]: I1210 12:08:42.474609 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" event={"ID":"267779dd-45a9-4ee6-985d-39fb7d7cb207","Type":"ContainerStarted","Data":"8bb0fb31fa12ab5ccc5842b0f2d4091f35e018f50fe6fb98d058676510d5081a"} Dec 10 12:08:45 crc kubenswrapper[4852]: I1210 12:08:45.789983 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:08:45 crc kubenswrapper[4852]: I1210 12:08:45.790343 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:08:50 crc kubenswrapper[4852]: I1210 12:08:50.532842 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" event={"ID":"267779dd-45a9-4ee6-985d-39fb7d7cb207","Type":"ContainerStarted","Data":"295d13097033a37e927fd73def40405b6453f79f5c380c104489c61c82392948"} Dec 10 12:08:50 crc kubenswrapper[4852]: I1210 12:08:50.533641 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:09:01 crc kubenswrapper[4852]: I1210 12:09:01.324367 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" Dec 10 12:09:01 crc kubenswrapper[4852]: I1210 12:09:01.355443 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5bc74577c9-ch9wf" podStartSLOduration=13.461907515 podStartE2EDuration="21.355423776s" podCreationTimestamp="2025-12-10 12:08:40 +0000 UTC" firstStartedPulling="2025-12-10 12:08:41.786373344 +0000 UTC m=+1007.871898558" lastFinishedPulling="2025-12-10 12:08:49.679889595 +0000 UTC m=+1015.765414819" observedRunningTime="2025-12-10 12:08:50.577899876 +0000 UTC m=+1016.663425110" watchObservedRunningTime="2025-12-10 12:09:01.355423776 +0000 UTC m=+1027.440949000" Dec 10 12:09:15 crc kubenswrapper[4852]: I1210 12:09:15.789901 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:09:15 crc kubenswrapper[4852]: I1210 12:09:15.790484 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.744384 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.745978 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.747799 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bfbdz" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.755491 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.756630 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.758637 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nb9tx" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.780316 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.781187 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.782848 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2bc89" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.788204 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.796538 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.797552 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.800702 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5wj84" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.803534 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.809338 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.810283 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.811753 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jdfb8" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.816364 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.836343 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.842967 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.854356 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.855626 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.856635 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bpd\" (UniqueName: \"kubernetes.io/projected/62e793a9-5b13-4532-90fe-d3313b3cf4d9-kube-api-access-88bpd\") pod \"glance-operator-controller-manager-5697bb5779-rhwzx\" (UID: \"62e793a9-5b13-4532-90fe-d3313b3cf4d9\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.856710 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gp7\" (UniqueName: \"kubernetes.io/projected/74cd0e4c-bd25-4b22-8b1f-cb3758f446fd-kube-api-access-d5gp7\") pod \"designate-operator-controller-manager-697fb699cf-tlflj\" (UID: \"74cd0e4c-bd25-4b22-8b1f-cb3758f446fd\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.856760 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4d7g\" (UniqueName: \"kubernetes.io/projected/391832bd-03d9-409e-93a0-b8986ed437ff-kube-api-access-b4d7g\") pod \"heat-operator-controller-manager-5f64f6f8bb-j8h26\" (UID: \"391832bd-03d9-409e-93a0-b8986ed437ff\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.856792 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98vxr\" (UniqueName: \"kubernetes.io/projected/88a0620c-81a0-4ad1-ae9a-13eb0d08e10f-kube-api-access-98vxr\") pod \"barbican-operator-controller-manager-7d9dfd778-bx58k\" (UID: \"88a0620c-81a0-4ad1-ae9a-13eb0d08e10f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.856818 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zv4j\" (UniqueName: \"kubernetes.io/projected/97d20a41-52e0-47d5-86fd-0f486080ebf5-kube-api-access-2zv4j\") pod 
\"cinder-operator-controller-manager-6c677c69b-pnppk\" (UID: \"97d20a41-52e0-47d5-86fd-0f486080ebf5\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.863863 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gzbwf" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.870749 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.871977 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.880651 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.880954 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dsx8w" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.910688 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.917003 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.922705 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.926020 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.931695 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bfphq" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.945561 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.946723 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.950288 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.950694 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2ctnh" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.958504 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gp7\" (UniqueName: \"kubernetes.io/projected/74cd0e4c-bd25-4b22-8b1f-cb3758f446fd-kube-api-access-d5gp7\") pod \"designate-operator-controller-manager-697fb699cf-tlflj\" (UID: \"74cd0e4c-bd25-4b22-8b1f-cb3758f446fd\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.958576 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4d7g\" (UniqueName: \"kubernetes.io/projected/391832bd-03d9-409e-93a0-b8986ed437ff-kube-api-access-b4d7g\") pod \"heat-operator-controller-manager-5f64f6f8bb-j8h26\" (UID: \"391832bd-03d9-409e-93a0-b8986ed437ff\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.958612 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98vxr\" (UniqueName: \"kubernetes.io/projected/88a0620c-81a0-4ad1-ae9a-13eb0d08e10f-kube-api-access-98vxr\") pod \"barbican-operator-controller-manager-7d9dfd778-bx58k\" (UID: \"88a0620c-81a0-4ad1-ae9a-13eb0d08e10f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.958636 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zv4j\" (UniqueName: \"kubernetes.io/projected/97d20a41-52e0-47d5-86fd-0f486080ebf5-kube-api-access-2zv4j\") pod \"cinder-operator-controller-manager-6c677c69b-pnppk\" (UID: \"97d20a41-52e0-47d5-86fd-0f486080ebf5\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.958681 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88bpd\" (UniqueName: \"kubernetes.io/projected/62e793a9-5b13-4532-90fe-d3313b3cf4d9-kube-api-access-88bpd\") pod \"glance-operator-controller-manager-5697bb5779-rhwzx\" (UID: \"62e793a9-5b13-4532-90fe-d3313b3cf4d9\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.964795 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.981311 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2"] Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.982676 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.991949 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vv9t2" Dec 10 12:09:20 crc kubenswrapper[4852]: I1210 12:09:20.997481 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gp7\" (UniqueName: \"kubernetes.io/projected/74cd0e4c-bd25-4b22-8b1f-cb3758f446fd-kube-api-access-d5gp7\") pod \"designate-operator-controller-manager-697fb699cf-tlflj\" (UID: \"74cd0e4c-bd25-4b22-8b1f-cb3758f446fd\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.001424 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bpd\" (UniqueName: \"kubernetes.io/projected/62e793a9-5b13-4532-90fe-d3313b3cf4d9-kube-api-access-88bpd\") pod \"glance-operator-controller-manager-5697bb5779-rhwzx\" (UID: \"62e793a9-5b13-4532-90fe-d3313b3cf4d9\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.002002 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.005853 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zv4j\" (UniqueName: \"kubernetes.io/projected/97d20a41-52e0-47d5-86fd-0f486080ebf5-kube-api-access-2zv4j\") pod \"cinder-operator-controller-manager-6c677c69b-pnppk\" (UID: \"97d20a41-52e0-47d5-86fd-0f486080ebf5\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.017219 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98vxr\" (UniqueName: \"kubernetes.io/projected/88a0620c-81a0-4ad1-ae9a-13eb0d08e10f-kube-api-access-98vxr\") pod \"barbican-operator-controller-manager-7d9dfd778-bx58k\" (UID: \"88a0620c-81a0-4ad1-ae9a-13eb0d08e10f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.018026 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4d7g\" (UniqueName: \"kubernetes.io/projected/391832bd-03d9-409e-93a0-b8986ed437ff-kube-api-access-b4d7g\") pod \"heat-operator-controller-manager-5f64f6f8bb-j8h26\" (UID: \"391832bd-03d9-409e-93a0-b8986ed437ff\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.026669 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.054571 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.065713 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.065797 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqvn\" (UniqueName: \"kubernetes.io/projected/79986568-4439-4f2a-9dc4-af5fb1a1d787-kube-api-access-vcqvn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pzw5d\" (UID: \"79986568-4439-4f2a-9dc4-af5fb1a1d787\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.065827 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsc6w\" (UniqueName: \"kubernetes.io/projected/3fc3907c-5313-44d8-90dd-155b24156a1b-kube-api-access-wsc6w\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.065914 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728x4\" (UniqueName: \"kubernetes.io/projected/2a5cb708-ca60-4763-bf61-6562a610e6dc-kube-api-access-728x4\") pod \"keystone-operator-controller-manager-7765d96ddf-4kkrb\" (UID: \"2a5cb708-ca60-4763-bf61-6562a610e6dc\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.066287 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjp2k\" (UniqueName: \"kubernetes.io/projected/67ab896e-72eb-4040-9397-2a2bcca37c7e-kube-api-access-xjp2k\") pod \"ironic-operator-controller-manager-967d97867-gbgfx\" (UID: \"67ab896e-72eb-4040-9397-2a2bcca37c7e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.066315 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdrnl\" (UniqueName: \"kubernetes.io/projected/f67c3362-3da1-45f6-8fc6-47e16b206173-kube-api-access-vdrnl\") pod \"horizon-operator-controller-manager-68c6d99b8f-54gtc\" (UID: \"f67c3362-3da1-45f6-8fc6-47e16b206173\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.067248 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.082022 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nr6hz" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.083460 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.086329 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.095732 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.101724 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.103660 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.105895 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ww7g4" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.110069 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.110461 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.111440 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.113149 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wht22" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.117267 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.134647 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.136219 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.143020 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.144009 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.151818 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mvvzb" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.162643 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167006 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728x4\" (UniqueName: \"kubernetes.io/projected/2a5cb708-ca60-4763-bf61-6562a610e6dc-kube-api-access-728x4\") pod \"keystone-operator-controller-manager-7765d96ddf-4kkrb\" (UID: \"2a5cb708-ca60-4763-bf61-6562a610e6dc\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167062 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vxj6\" (UniqueName: \"kubernetes.io/projected/9c39ec89-c5bf-4cdd-a253-154db7bcf781-kube-api-access-9vxj6\") pod \"manila-operator-controller-manager-5b5fd79c9c-b2jw2\" (UID: \"9c39ec89-c5bf-4cdd-a253-154db7bcf781\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167085 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjp2k\" (UniqueName: \"kubernetes.io/projected/67ab896e-72eb-4040-9397-2a2bcca37c7e-kube-api-access-xjp2k\") pod \"ironic-operator-controller-manager-967d97867-gbgfx\" (UID: \"67ab896e-72eb-4040-9397-2a2bcca37c7e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167109 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdrnl\" (UniqueName: \"kubernetes.io/projected/f67c3362-3da1-45f6-8fc6-47e16b206173-kube-api-access-vdrnl\") pod \"horizon-operator-controller-manager-68c6d99b8f-54gtc\" (UID: \"f67c3362-3da1-45f6-8fc6-47e16b206173\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167133 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167163 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqvn\" (UniqueName: \"kubernetes.io/projected/79986568-4439-4f2a-9dc4-af5fb1a1d787-kube-api-access-vcqvn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pzw5d\" (UID: \"79986568-4439-4f2a-9dc4-af5fb1a1d787\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.167183 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsc6w\" (UniqueName: \"kubernetes.io/projected/3fc3907c-5313-44d8-90dd-155b24156a1b-kube-api-access-wsc6w\") pod 
\"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.167362 4852 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.167408 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert podName:3fc3907c-5313-44d8-90dd-155b24156a1b nodeName:}" failed. No retries permitted until 2025-12-10 12:09:21.667393408 +0000 UTC m=+1047.752918632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert") pod "infra-operator-controller-manager-78d48bff9d-5p988" (UID: "3fc3907c-5313-44d8-90dd-155b24156a1b") : secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.186559 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.187548 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.189859 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-frcmd" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.198610 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728x4\" (UniqueName: \"kubernetes.io/projected/2a5cb708-ca60-4763-bf61-6562a610e6dc-kube-api-access-728x4\") pod \"keystone-operator-controller-manager-7765d96ddf-4kkrb\" (UID: \"2a5cb708-ca60-4763-bf61-6562a610e6dc\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.199042 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.199619 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdrnl\" (UniqueName: \"kubernetes.io/projected/f67c3362-3da1-45f6-8fc6-47e16b206173-kube-api-access-vdrnl\") pod \"horizon-operator-controller-manager-68c6d99b8f-54gtc\" (UID: \"f67c3362-3da1-45f6-8fc6-47e16b206173\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.201446 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqvn\" (UniqueName: \"kubernetes.io/projected/79986568-4439-4f2a-9dc4-af5fb1a1d787-kube-api-access-vcqvn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pzw5d\" (UID: \"79986568-4439-4f2a-9dc4-af5fb1a1d787\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.201961 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjp2k\" (UniqueName: \"kubernetes.io/projected/67ab896e-72eb-4040-9397-2a2bcca37c7e-kube-api-access-xjp2k\") pod \"ironic-operator-controller-manager-967d97867-gbgfx\" (UID: 
\"67ab896e-72eb-4040-9397-2a2bcca37c7e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.208390 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsc6w\" (UniqueName: \"kubernetes.io/projected/3fc3907c-5313-44d8-90dd-155b24156a1b-kube-api-access-wsc6w\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.235413 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.239404 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.241184 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n9x4b" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.249514 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.253776 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.254776 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.255776 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.256606 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ltf72" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.256749 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.257175 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.257393 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xxnpd" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.261546 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.269577 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vxj6\" (UniqueName: \"kubernetes.io/projected/9c39ec89-c5bf-4cdd-a253-154db7bcf781-kube-api-access-9vxj6\") pod \"manila-operator-controller-manager-5b5fd79c9c-b2jw2\" (UID: \"9c39ec89-c5bf-4cdd-a253-154db7bcf781\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.269689 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28z48\" (UniqueName: \"kubernetes.io/projected/f53525dc-0dc9-44c5-a947-2e303cb0ed1c-kube-api-access-28z48\") pod \"mariadb-operator-controller-manager-79c8c4686c-nlcxk\" (UID: \"f53525dc-0dc9-44c5-a947-2e303cb0ed1c\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.269727 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltxxn\" (UniqueName: \"kubernetes.io/projected/bf62d827-9a6d-4a53-9a65-b287195f3bea-kube-api-access-ltxxn\") pod \"nova-operator-controller-manager-697bc559fc-p22mj\" (UID: \"bf62d827-9a6d-4a53-9a65-b287195f3bea\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.269766 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvc9\" (UniqueName: \"kubernetes.io/projected/2a55ad46-c35b-4429-b1da-7a361f7c45d0-kube-api-access-zvvc9\") pod \"octavia-operator-controller-manager-998648c74-xr9c5\" (UID: \"2a55ad46-c35b-4429-b1da-7a361f7c45d0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.270163 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.282463 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.291639 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.306084 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.307150 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.319152 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rz8xx" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.320315 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.337086 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vxj6\" (UniqueName: \"kubernetes.io/projected/9c39ec89-c5bf-4cdd-a253-154db7bcf781-kube-api-access-9vxj6\") pod \"manila-operator-controller-manager-5b5fd79c9c-b2jw2\" (UID: \"9c39ec89-c5bf-4cdd-a253-154db7bcf781\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.341639 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.343254 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.345399 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x56pt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.349095 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.370813 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knrc\" (UniqueName: \"kubernetes.io/projected/75a5b678-ba48-4191-99e6-aeeaf32bf40e-kube-api-access-2knrc\") pod \"ovn-operator-controller-manager-b6456fdb6-86mvp\" (UID: \"75a5b678-ba48-4191-99e6-aeeaf32bf40e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.370901 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvsx\" (UniqueName: \"kubernetes.io/projected/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-kube-api-access-2pvsx\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.370931 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28z48\" (UniqueName: \"kubernetes.io/projected/f53525dc-0dc9-44c5-a947-2e303cb0ed1c-kube-api-access-28z48\") pod \"mariadb-operator-controller-manager-79c8c4686c-nlcxk\" (UID: \"f53525dc-0dc9-44c5-a947-2e303cb0ed1c\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.370958 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltxxn\" (UniqueName: \"kubernetes.io/projected/bf62d827-9a6d-4a53-9a65-b287195f3bea-kube-api-access-ltxxn\") pod \"nova-operator-controller-manager-697bc559fc-p22mj\" (UID: 
\"bf62d827-9a6d-4a53-9a65-b287195f3bea\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.370975 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w22\" (UniqueName: \"kubernetes.io/projected/785eda15-0a5d-451d-8ec4-b35e1f8d8147-kube-api-access-f4w22\") pod \"placement-operator-controller-manager-78f8948974-kc8c8\" (UID: \"785eda15-0a5d-451d-8ec4-b35e1f8d8147\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.371003 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvc9\" (UniqueName: \"kubernetes.io/projected/2a55ad46-c35b-4429-b1da-7a361f7c45d0-kube-api-access-zvvc9\") pod \"octavia-operator-controller-manager-998648c74-xr9c5\" (UID: \"2a55ad46-c35b-4429-b1da-7a361f7c45d0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.371019 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrksm\" (UniqueName: \"kubernetes.io/projected/31cc1af5-d198-472a-aa62-2ce735f4453b-kube-api-access-hrksm\") pod \"swift-operator-controller-manager-9d58d64bc-q26ll\" (UID: \"31cc1af5-d198-472a-aa62-2ce735f4453b\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.371043 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.381377 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.382477 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.383182 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.386729 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-zktlp" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.391431 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.396752 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.401457 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28z48\" (UniqueName: \"kubernetes.io/projected/f53525dc-0dc9-44c5-a947-2e303cb0ed1c-kube-api-access-28z48\") pod \"mariadb-operator-controller-manager-79c8c4686c-nlcxk\" (UID: \"f53525dc-0dc9-44c5-a947-2e303cb0ed1c\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.420164 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltxxn\" (UniqueName: \"kubernetes.io/projected/bf62d827-9a6d-4a53-9a65-b287195f3bea-kube-api-access-ltxxn\") pod \"nova-operator-controller-manager-697bc559fc-p22mj\" (UID: \"bf62d827-9a6d-4a53-9a65-b287195f3bea\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.420869 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvc9\" (UniqueName: \"kubernetes.io/projected/2a55ad46-c35b-4429-b1da-7a361f7c45d0-kube-api-access-zvvc9\") pod \"octavia-operator-controller-manager-998648c74-xr9c5\" (UID: \"2a55ad46-c35b-4429-b1da-7a361f7c45d0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.424175 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.448247 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485393 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvsx\" (UniqueName: \"kubernetes.io/projected/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-kube-api-access-2pvsx\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485514 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w22\" (UniqueName: \"kubernetes.io/projected/785eda15-0a5d-451d-8ec4-b35e1f8d8147-kube-api-access-f4w22\") pod \"placement-operator-controller-manager-78f8948974-kc8c8\" (UID: \"785eda15-0a5d-451d-8ec4-b35e1f8d8147\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485580 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrksm\" (UniqueName: \"kubernetes.io/projected/31cc1af5-d198-472a-aa62-2ce735f4453b-kube-api-access-hrksm\") pod \"swift-operator-controller-manager-9d58d64bc-q26ll\" (UID: \"31cc1af5-d198-472a-aa62-2ce735f4453b\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485626 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485664 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2v9\" (UniqueName: \"kubernetes.io/projected/d01d86ae-5138-4298-8ec0-7aa8cdd468fe-kube-api-access-bj2v9\") pod \"watcher-operator-controller-manager-75944c9b7-57gb2\" (UID: \"d01d86ae-5138-4298-8ec0-7aa8cdd468fe\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485702 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knrc\" (UniqueName: \"kubernetes.io/projected/75a5b678-ba48-4191-99e6-aeeaf32bf40e-kube-api-access-2knrc\") pod \"ovn-operator-controller-manager-b6456fdb6-86mvp\" (UID: \"75a5b678-ba48-4191-99e6-aeeaf32bf40e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485759 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2rf\" (UniqueName: \"kubernetes.io/projected/c7a73ae7-6060-497a-b94f-8988c2244f94-kube-api-access-zp2rf\") pod \"telemetry-operator-controller-manager-58d5ff84df-lhzps\" (UID: \"c7a73ae7-6060-497a-b94f-8988c2244f94\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.485786 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrcxn\" 
(UniqueName: \"kubernetes.io/projected/6b9c74bb-9c09-4976-be53-8b2c296f7788-kube-api-access-nrcxn\") pod \"test-operator-controller-manager-5854674fcc-zhxkc\" (UID: \"6b9c74bb-9c09-4976-be53-8b2c296f7788\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.487964 4852 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.488015 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert podName:3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:21.987997282 +0000 UTC m=+1048.073522506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f8csbt" (UID: "3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.491882 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.520704 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.522677 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.525451 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.525785 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nl42f" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.526015 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.540732 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.556043 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrksm\" (UniqueName: \"kubernetes.io/projected/31cc1af5-d198-472a-aa62-2ce735f4453b-kube-api-access-hrksm\") pod \"swift-operator-controller-manager-9d58d64bc-q26ll\" (UID: \"31cc1af5-d198-472a-aa62-2ce735f4453b\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.563196 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.566605 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvsx\" (UniqueName: \"kubernetes.io/projected/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-kube-api-access-2pvsx\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.567585 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2knrc\" (UniqueName: \"kubernetes.io/projected/75a5b678-ba48-4191-99e6-aeeaf32bf40e-kube-api-access-2knrc\") pod \"ovn-operator-controller-manager-b6456fdb6-86mvp\" (UID: \"75a5b678-ba48-4191-99e6-aeeaf32bf40e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.567701 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.568608 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.570155 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w22\" (UniqueName: \"kubernetes.io/projected/785eda15-0a5d-451d-8ec4-b35e1f8d8147-kube-api-access-f4w22\") pod \"placement-operator-controller-manager-78f8948974-kc8c8\" (UID: \"785eda15-0a5d-451d-8ec4-b35e1f8d8147\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.570598 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cfbqd" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.579835 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc"] Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586504 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2v9\" (UniqueName: \"kubernetes.io/projected/d01d86ae-5138-4298-8ec0-7aa8cdd468fe-kube-api-access-bj2v9\") pod \"watcher-operator-controller-manager-75944c9b7-57gb2\" (UID: \"d01d86ae-5138-4298-8ec0-7aa8cdd468fe\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586571 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2rf\" (UniqueName: \"kubernetes.io/projected/c7a73ae7-6060-497a-b94f-8988c2244f94-kube-api-access-zp2rf\") pod \"telemetry-operator-controller-manager-58d5ff84df-lhzps\" (UID: \"c7a73ae7-6060-497a-b94f-8988c2244f94\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586597 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrcxn\" (UniqueName: \"kubernetes.io/projected/6b9c74bb-9c09-4976-be53-8b2c296f7788-kube-api-access-nrcxn\") pod \"test-operator-controller-manager-5854674fcc-zhxkc\" 
(UID: \"6b9c74bb-9c09-4976-be53-8b2c296f7788\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586622 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586638 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586672 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9zw\" (UniqueName: \"kubernetes.io/projected/524d7bc8-a871-4ff2-bc13-1a84d07bb0e9-kube-api-access-ng9zw\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkcjc\" (UID: \"524d7bc8-a871-4ff2-bc13-1a84d07bb0e9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.586707 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24k7z\" (UniqueName: \"kubernetes.io/projected/bbce747f-ad24-476e-8746-f2bb89eba637-kube-api-access-24k7z\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.604852 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2v9\" (UniqueName: \"kubernetes.io/projected/d01d86ae-5138-4298-8ec0-7aa8cdd468fe-kube-api-access-bj2v9\") pod \"watcher-operator-controller-manager-75944c9b7-57gb2\" (UID: \"d01d86ae-5138-4298-8ec0-7aa8cdd468fe\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.605247 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2rf\" (UniqueName: \"kubernetes.io/projected/c7a73ae7-6060-497a-b94f-8988c2244f94-kube-api-access-zp2rf\") pod \"telemetry-operator-controller-manager-58d5ff84df-lhzps\" (UID: \"c7a73ae7-6060-497a-b94f-8988c2244f94\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.605387 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrcxn\" (UniqueName: \"kubernetes.io/projected/6b9c74bb-9c09-4976-be53-8b2c296f7788-kube-api-access-nrcxn\") pod \"test-operator-controller-manager-5854674fcc-zhxkc\" (UID: \"6b9c74bb-9c09-4976-be53-8b2c296f7788\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.622558 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.688748 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.688799 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.688851 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9zw\" (UniqueName: \"kubernetes.io/projected/524d7bc8-a871-4ff2-bc13-1a84d07bb0e9-kube-api-access-ng9zw\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkcjc\" (UID: \"524d7bc8-a871-4ff2-bc13-1a84d07bb0e9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.688909 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24k7z\" (UniqueName: \"kubernetes.io/projected/bbce747f-ad24-476e-8746-f2bb89eba637-kube-api-access-24k7z\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.688987 4852 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.688988 4852 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.689018 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.689049 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:22.189030747 +0000 UTC m=+1048.274555971 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.689100 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:22.189072238 +0000 UTC m=+1048.274597462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "metrics-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.689123 4852 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: E1210 12:09:21.689163 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert podName:3fc3907c-5313-44d8-90dd-155b24156a1b nodeName:}" failed. No retries permitted until 2025-12-10 12:09:22.68915029 +0000 UTC m=+1048.774675674 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert") pod "infra-operator-controller-manager-78d48bff9d-5p988" (UID: "3fc3907c-5313-44d8-90dd-155b24156a1b") : secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.727654 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24k7z\" (UniqueName: \"kubernetes.io/projected/bbce747f-ad24-476e-8746-f2bb89eba637-kube-api-access-24k7z\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.731084 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9zw\" (UniqueName: \"kubernetes.io/projected/524d7bc8-a871-4ff2-bc13-1a84d07bb0e9-kube-api-access-ng9zw\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkcjc\" (UID: \"524d7bc8-a871-4ff2-bc13-1a84d07bb0e9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.752279 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.769070 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.773601 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.874618 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:09:21 crc kubenswrapper[4852]: I1210 12:09:21.906010 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.004822 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.004966 4852 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.005015 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert podName:3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:23.005000005 +0000 UTC m=+1049.090525229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f8csbt" (UID: "3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.044099 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.237783 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.237822 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.237932 4852 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.237982 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:23.237967186 +0000 UTC m=+1049.323492410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "webhook-server-cert" not found Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.238310 4852 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.238339 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:23.238332315 +0000 UTC m=+1049.323857539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "metrics-server-cert" not found Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.318846 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k"] Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.499037 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26"] Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.511662 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk"] Dec 10 12:09:22 crc kubenswrapper[4852]: W1210 12:09:22.515372 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod391832bd_03d9_409e_93a0_b8986ed437ff.slice/crio-f5a971f11ca2d6d116ea330f009ffb539c778ade99ee6e62b439f4f3f6e14745 WatchSource:0}: Error finding container f5a971f11ca2d6d116ea330f009ffb539c778ade99ee6e62b439f4f3f6e14745: Status 404 returned error can't find the container with id f5a971f11ca2d6d116ea330f009ffb539c778ade99ee6e62b439f4f3f6e14745 Dec 10 12:09:22 crc kubenswrapper[4852]: W1210 12:09:22.520298 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d20a41_52e0_47d5_86fd_0f486080ebf5.slice/crio-e1546bdd26e05821392116f91f1dd8e8b10addb651576970aea049be47dfd4b8 WatchSource:0}: Error finding container e1546bdd26e05821392116f91f1dd8e8b10addb651576970aea049be47dfd4b8: Status 404 returned error can't find the container with id e1546bdd26e05821392116f91f1dd8e8b10addb651576970aea049be47dfd4b8 Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.533536 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj"] Dec 10 12:09:22 crc kubenswrapper[4852]: W1210 12:09:22.545368 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74cd0e4c_bd25_4b22_8b1f_cb3758f446fd.slice/crio-a3c787d1c49eb3fd09900c8d1c7b271e81b68e8f3ff62da56d3f1c9341519e3f WatchSource:0}: Error finding container a3c787d1c49eb3fd09900c8d1c7b271e81b68e8f3ff62da56d3f1c9341519e3f: Status 404 returned error can't find the container with id a3c787d1c49eb3fd09900c8d1c7b271e81b68e8f3ff62da56d3f1c9341519e3f 
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.546740 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx"]
Dec 10 12:09:22 crc kubenswrapper[4852]: W1210 12:09:22.557353 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e793a9_5b13_4532_90fe_d3313b3cf4d9.slice/crio-8ccdf4a63aed6900c68ab117cd02402f80d090b06b03635f4feee43273765b7d WatchSource:0}: Error finding container 8ccdf4a63aed6900c68ab117cd02402f80d090b06b03635f4feee43273765b7d: Status 404 returned error can't find the container with id 8ccdf4a63aed6900c68ab117cd02402f80d090b06b03635f4feee43273765b7d
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.732342 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" event={"ID":"391832bd-03d9-409e-93a0-b8986ed437ff","Type":"ContainerStarted","Data":"f5a971f11ca2d6d116ea330f009ffb539c778ade99ee6e62b439f4f3f6e14745"}
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.736989 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" event={"ID":"62e793a9-5b13-4532-90fe-d3313b3cf4d9","Type":"ContainerStarted","Data":"8ccdf4a63aed6900c68ab117cd02402f80d090b06b03635f4feee43273765b7d"}
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.742837 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" event={"ID":"97d20a41-52e0-47d5-86fd-0f486080ebf5","Type":"ContainerStarted","Data":"e1546bdd26e05821392116f91f1dd8e8b10addb651576970aea049be47dfd4b8"}
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.746628 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988"
Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.746936 4852 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.746969 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" event={"ID":"88a0620c-81a0-4ad1-ae9a-13eb0d08e10f","Type":"ContainerStarted","Data":"50998480d8d87cb756c007134efd569241226774a2b0557a5080449627e45cc5"}
Dec 10 12:09:22 crc kubenswrapper[4852]: E1210 12:09:22.747009 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert podName:3fc3907c-5313-44d8-90dd-155b24156a1b nodeName:}" failed. No retries permitted until 2025-12-10 12:09:24.746988531 +0000 UTC m=+1050.832513755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert") pod "infra-operator-controller-manager-78d48bff9d-5p988" (UID: "3fc3907c-5313-44d8-90dd-155b24156a1b") : secret "infra-operator-webhook-server-cert" not found
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.748093 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" event={"ID":"74cd0e4c-bd25-4b22-8b1f-cb3758f446fd","Type":"ContainerStarted","Data":"a3c787d1c49eb3fd09900c8d1c7b271e81b68e8f3ff62da56d3f1c9341519e3f"}
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.785085 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2"]
Dec 10 12:09:22 crc kubenswrapper[4852]: W1210 12:09:22.797294 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c39ec89_c5bf_4cdd_a253_154db7bcf781.slice/crio-d906630d6368e21d8bf08f7e1cde966ebf53ed264bfa1e0595205aa45483d4e9 WatchSource:0}: Error finding container d906630d6368e21d8bf08f7e1cde966ebf53ed264bfa1e0595205aa45483d4e9: Status 404 returned error can't find the container with id d906630d6368e21d8bf08f7e1cde966ebf53ed264bfa1e0595205aa45483d4e9
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.969433 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll"]
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.980603 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk"]
Dec 10 12:09:22 crc kubenswrapper[4852]: I1210 12:09:22.994938 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.004186 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.050618 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.051323 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.051431 4852 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.051468 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert podName:3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:25.051455362 +0000 UTC m=+1051.136980586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f8csbt" (UID: "3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.056166 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj"]
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.059099 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4w22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-kc8c8_openstack-operators(785eda15-0a5d-451d-8ec4-b35e1f8d8147): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.060540 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ltxxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-p22mj_openstack-operators(bf62d827-9a6d-4a53-9a65-b287195f3bea): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.061395 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4w22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-kc8c8_openstack-operators(785eda15-0a5d-451d-8ec4-b35e1f8d8147): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: W1210 12:09:23.062165 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ab896e_72eb_4040_9397_2a2bcca37c7e.slice/crio-992356d51f2e180efb7fe7159fce95c1aa0b632157d8f80e325e1c8150a818a0 WatchSource:0}: Error finding container 992356d51f2e180efb7fe7159fce95c1aa0b632157d8f80e325e1c8150a818a0: Status 404 returned error can't find the container with id 992356d51f2e180efb7fe7159fce95c1aa0b632157d8f80e325e1c8150a818a0
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.062460 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" podUID="785eda15-0a5d-451d-8ec4-b35e1f8d8147"
Dec 10 12:09:23 crc kubenswrapper[4852]: W1210 12:09:23.063306 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a5cb708_ca60_4763_bf61_6562a610e6dc.slice/crio-b9cdebe1875580829eb973bf035f9a61c831605f2b08981ac33619b788b1b428 WatchSource:0}: Error finding container b9cdebe1875580829eb973bf035f9a61c831605f2b08981ac33619b788b1b428: Status 404 returned error can't find the container with id b9cdebe1875580829eb973bf035f9a61c831605f2b08981ac33619b788b1b428
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.063852 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ltxxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-p22mj_openstack-operators(bf62d827-9a6d-4a53-9a65-b287195f3bea): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.065045 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" podUID="bf62d827-9a6d-4a53-9a65-b287195f3bea"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.068796 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvvc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-xr9c5_openstack-operators(2a55ad46-c35b-4429-b1da-7a361f7c45d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.069042 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xjp2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-gbgfx_openstack-operators(67ab896e-72eb-4040-9397-2a2bcca37c7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.071867 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8"]
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.074539 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xjp2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-gbgfx_openstack-operators(67ab896e-72eb-4040-9397-2a2bcca37c7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.075649 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" podUID="67ab896e-72eb-4040-9397-2a2bcca37c7e"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.076222 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-728x4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-4kkrb_openstack-operators(2a5cb708-ca60-4763-bf61-6562a610e6dc): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.076210 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvvc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-xr9c5_openstack-operators(2a55ad46-c35b-4429-b1da-7a361f7c45d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.077905 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-728x4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-4kkrb_openstack-operators(2a5cb708-ca60-4763-bf61-6562a610e6dc): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.077992 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" podUID="2a55ad46-c35b-4429-b1da-7a361f7c45d0"
Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.079680 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" podUID="2a5cb708-ca60-4763-bf61-6562a610e6dc"
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.086862 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.095973 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.106349 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.115428 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.123700 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5"]
Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.212556 4852 kubelet.go:2428] "SyncLoop UPDATE"
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps"] Dec 10 12:09:23 crc kubenswrapper[4852]: W1210 12:09:23.216171 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7a73ae7_6060_497a_b94f_8988c2244f94.slice/crio-91c94dc4f34078693fc047ec35a7474910ca2ffbffff8a83b394304f8029a5ce WatchSource:0}: Error finding container 91c94dc4f34078693fc047ec35a7474910ca2ffbffff8a83b394304f8029a5ce: Status 404 returned error can't find the container with id 91c94dc4f34078693fc047ec35a7474910ca2ffbffff8a83b394304f8029a5ce Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.220018 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc"] Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.245358 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp2rf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-lhzps_openstack-operators(c7a73ae7-6060-497a-b94f-8988c2244f94): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.249988 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp2rf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-lhzps_openstack-operators(c7a73ae7-6060-497a-b94f-8988c2244f94): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.251776 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" podUID="c7a73ae7-6060-497a-b94f-8988c2244f94" Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.253852 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.253896 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.254054 4852 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.254112 4852 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.254119 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. 
No retries permitted until 2025-12-10 12:09:25.254098758 +0000 UTC m=+1051.339624052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "metrics-server-cert" not found Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.254174 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:25.2541603 +0000 UTC m=+1051.339685524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "webhook-server-cert" not found Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.755441 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" event={"ID":"79986568-4439-4f2a-9dc4-af5fb1a1d787","Type":"ContainerStarted","Data":"558498830a00473d6ce46857099c0b8ab3e523306197f3bf243629f312da9a45"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.756779 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" event={"ID":"67ab896e-72eb-4040-9397-2a2bcca37c7e","Type":"ContainerStarted","Data":"992356d51f2e180efb7fe7159fce95c1aa0b632157d8f80e325e1c8150a818a0"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.758119 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" event={"ID":"75a5b678-ba48-4191-99e6-aeeaf32bf40e","Type":"ContainerStarted","Data":"b8ef167d625fc0be2681f6ad0d4838edb253dc0eb5040862100d534275721d05"} Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.758673 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" podUID="67ab896e-72eb-4040-9397-2a2bcca37c7e" Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.759279 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" event={"ID":"f67c3362-3da1-45f6-8fc6-47e16b206173","Type":"ContainerStarted","Data":"5c63e32282f7074fc75d91a6f14ea8a4f6c4f843a3a7c91dac987a8f1b4c863d"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.760364 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" event={"ID":"c7a73ae7-6060-497a-b94f-8988c2244f94","Type":"ContainerStarted","Data":"91c94dc4f34078693fc047ec35a7474910ca2ffbffff8a83b394304f8029a5ce"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.761315 4852 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" event={"ID":"785eda15-0a5d-451d-8ec4-b35e1f8d8147","Type":"ContainerStarted","Data":"2045883e0270e99f213ee3023111fe1f683ba4c1b85ec2d234e2133c057c2d38"} Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.768957 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" podUID="c7a73ae7-6060-497a-b94f-8988c2244f94" Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.769450 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" podUID="785eda15-0a5d-451d-8ec4-b35e1f8d8147" Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.770716 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" event={"ID":"2a5cb708-ca60-4763-bf61-6562a610e6dc","Type":"ContainerStarted","Data":"b9cdebe1875580829eb973bf035f9a61c831605f2b08981ac33619b788b1b428"} Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.771909 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" podUID="2a5cb708-ca60-4763-bf61-6562a610e6dc" Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.772213 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" event={"ID":"9c39ec89-c5bf-4cdd-a253-154db7bcf781","Type":"ContainerStarted","Data":"d906630d6368e21d8bf08f7e1cde966ebf53ed264bfa1e0595205aa45483d4e9"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.773438 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" event={"ID":"f53525dc-0dc9-44c5-a947-2e303cb0ed1c","Type":"ContainerStarted","Data":"d6633951d186b950c904918425b4fd25ce18b6dde1f5ac8696a57707ef176f57"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.774116 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" 
event={"ID":"6b9c74bb-9c09-4976-be53-8b2c296f7788","Type":"ContainerStarted","Data":"5c6b27c73cb5e333334f7d9b7dc1b8cf9dcabc08005a9707ff2c6ba92b125646"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.774910 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" event={"ID":"bf62d827-9a6d-4a53-9a65-b287195f3bea","Type":"ContainerStarted","Data":"f22fa20daadf1f1e2178b8406ae2629d54c04500133af1b8e0d8c986fd44dedf"} Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.776428 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" podUID="bf62d827-9a6d-4a53-9a65-b287195f3bea" Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.777008 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" event={"ID":"31cc1af5-d198-472a-aa62-2ce735f4453b","Type":"ContainerStarted","Data":"f6ce996ced64ec4948b8bf507bf1e7a48db40d3756c049e0e8c64bafdf4ab72d"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.779392 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" event={"ID":"524d7bc8-a871-4ff2-bc13-1a84d07bb0e9","Type":"ContainerStarted","Data":"6eeb739eb09fb00d996b09926622fe0d801d612c09c8e2c0f7e755d7627cdf8d"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.780427 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" event={"ID":"d01d86ae-5138-4298-8ec0-7aa8cdd468fe","Type":"ContainerStarted","Data":"ab57d67d6aa06e743bf7b0bc5f10a3f35df4cc17d1482a123279c74c1e6177a2"} Dec 10 12:09:23 crc kubenswrapper[4852]: I1210 12:09:23.781291 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" event={"ID":"2a55ad46-c35b-4429-b1da-7a361f7c45d0","Type":"ContainerStarted","Data":"5ff2aa20f5ec4658d89314d582ffd55a407a228d6503a83a5467c431c037fff1"} Dec 10 12:09:23 crc kubenswrapper[4852]: E1210 12:09:23.784193 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" podUID="2a55ad46-c35b-4429-b1da-7a361f7c45d0" Dec 10 12:09:24 crc kubenswrapper[4852]: I1210 12:09:24.775355 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " 
pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.775581 4852 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.775680 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert podName:3fc3907c-5313-44d8-90dd-155b24156a1b nodeName:}" failed. No retries permitted until 2025-12-10 12:09:28.775656245 +0000 UTC m=+1054.861181469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert") pod "infra-operator-controller-manager-78d48bff9d-5p988" (UID: "3fc3907c-5313-44d8-90dd-155b24156a1b") : secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.789864 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" podUID="2a5cb708-ca60-4763-bf61-6562a610e6dc" Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.789961 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" podUID="bf62d827-9a6d-4a53-9a65-b287195f3bea" Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.790048 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" podUID="67ab896e-72eb-4040-9397-2a2bcca37c7e" Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.790158 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" podUID="2a55ad46-c35b-4429-b1da-7a361f7c45d0" Dec 10 12:09:24 crc kubenswrapper[4852]: 
E1210 12:09:24.790260 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" podUID="c7a73ae7-6060-497a-b94f-8988c2244f94" Dec 10 12:09:24 crc kubenswrapper[4852]: E1210 12:09:24.791280 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" podUID="785eda15-0a5d-451d-8ec4-b35e1f8d8147" Dec 10 12:09:25 crc kubenswrapper[4852]: I1210 12:09:25.082107 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:25 crc kubenswrapper[4852]: E1210 12:09:25.082394 4852 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:25 crc kubenswrapper[4852]: E1210 12:09:25.082479 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert podName:3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:29.082455844 +0000 UTC m=+1055.167981118 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f8csbt" (UID: "3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:25 crc kubenswrapper[4852]: I1210 12:09:25.284335 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:25 crc kubenswrapper[4852]: I1210 12:09:25.285035 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:25 crc kubenswrapper[4852]: E1210 12:09:25.284998 4852 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 10 12:09:25 crc kubenswrapper[4852]: E1210 12:09:25.285473 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:29.285455479 +0000 UTC m=+1055.370980703 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "metrics-server-cert" not found Dec 10 12:09:25 crc kubenswrapper[4852]: E1210 12:09:25.285360 4852 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 10 12:09:25 crc kubenswrapper[4852]: E1210 12:09:25.285831 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:29.285820778 +0000 UTC m=+1055.371346002 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "webhook-server-cert" not found Dec 10 12:09:28 crc kubenswrapper[4852]: I1210 12:09:28.845331 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:28 crc kubenswrapper[4852]: E1210 12:09:28.845628 4852 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:28 crc kubenswrapper[4852]: E1210 12:09:28.845890 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert podName:3fc3907c-5313-44d8-90dd-155b24156a1b nodeName:}" failed. No retries permitted until 2025-12-10 12:09:36.845872365 +0000 UTC m=+1062.931397589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert") pod "infra-operator-controller-manager-78d48bff9d-5p988" (UID: "3fc3907c-5313-44d8-90dd-155b24156a1b") : secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:29 crc kubenswrapper[4852]: I1210 12:09:29.149614 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:29 crc kubenswrapper[4852]: E1210 12:09:29.149830 4852 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:29 crc kubenswrapper[4852]: E1210 12:09:29.149879 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert podName:3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:37.149865114 +0000 UTC m=+1063.235390328 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f8csbt" (UID: "3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 10 12:09:29 crc kubenswrapper[4852]: I1210 12:09:29.353267 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:29 crc kubenswrapper[4852]: I1210 12:09:29.353324 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:29 crc kubenswrapper[4852]: E1210 12:09:29.353487 4852 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 10 12:09:29 crc kubenswrapper[4852]: E1210 12:09:29.353500 4852 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 10 12:09:29 crc kubenswrapper[4852]: E1210 12:09:29.353586 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:37.353559846 +0000 UTC m=+1063.439085070 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "metrics-server-cert" not found Dec 10 12:09:29 crc kubenswrapper[4852]: E1210 12:09:29.353705 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs podName:bbce747f-ad24-476e-8746-f2bb89eba637 nodeName:}" failed. No retries permitted until 2025-12-10 12:09:37.353674149 +0000 UTC m=+1063.439199453 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs") pod "openstack-operator-controller-manager-6d7c94c9c8-s6npl" (UID: "bbce747f-ad24-476e-8746-f2bb89eba637") : secret "webhook-server-cert" not found Dec 10 12:09:36 crc kubenswrapper[4852]: I1210 12:09:36.881192 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:36 crc kubenswrapper[4852]: E1210 12:09:36.881369 4852 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:36 crc kubenswrapper[4852]: E1210 12:09:36.881850 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert podName:3fc3907c-5313-44d8-90dd-155b24156a1b nodeName:}" failed. No retries permitted until 2025-12-10 12:09:52.881829552 +0000 UTC m=+1078.967354776 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert") pod "infra-operator-controller-manager-78d48bff9d-5p988" (UID: "3fc3907c-5313-44d8-90dd-155b24156a1b") : secret "infra-operator-webhook-server-cert" not found Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.185368 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.190661 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f8csbt\" (UID: \"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.258051 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.388632 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.388674 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.393862 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-webhook-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.395983 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbce747f-ad24-476e-8746-f2bb89eba637-metrics-certs\") pod \"openstack-operator-controller-manager-6d7c94c9c8-s6npl\" (UID: \"bbce747f-ad24-476e-8746-f2bb89eba637\") " pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.530043 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:09:37 crc kubenswrapper[4852]: E1210 12:09:37.689790 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3" Dec 10 12:09:37 crc kubenswrapper[4852]: E1210 12:09:37.689976 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2zv4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-pnppk_openstack-operators(97d20a41-52e0-47d5-86fd-0f486080ebf5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:37 crc kubenswrapper[4852]: I1210 12:09:37.737608 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:09:38 crc kubenswrapper[4852]: E1210 12:09:38.323097 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a" Dec 10 12:09:38 crc kubenswrapper[4852]: E1210 
12:09:38.323588 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bj2v9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-57gb2_openstack-operators(d01d86ae-5138-4298-8ec0-7aa8cdd468fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:39 crc kubenswrapper[4852]: E1210 12:09:39.055022 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 10 12:09:39 crc kubenswrapper[4852]: E1210 12:09:39.055221 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vcqvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-pzw5d_openstack-operators(79986568-4439-4f2a-9dc4-af5fb1a1d787): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:39 crc kubenswrapper[4852]: E1210 12:09:39.816870 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 10 12:09:39 crc kubenswrapper[4852]: E1210 12:09:39.817040 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdrnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-54gtc_openstack-operators(f67c3362-3da1-45f6-8fc6-47e16b206173): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:45 crc kubenswrapper[4852]: I1210 12:09:45.789582 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:09:45 crc kubenswrapper[4852]: I1210 12:09:45.790134 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:09:45 crc kubenswrapper[4852]: I1210 12:09:45.790184 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:09:45 crc kubenswrapper[4852]: I1210 12:09:45.790856 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1526ef670a66096e17d3fb224d460b0768d7f2150066a4bc7f3d701b213bd881"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:09:45 crc kubenswrapper[4852]: I1210 12:09:45.790908 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://1526ef670a66096e17d3fb224d460b0768d7f2150066a4bc7f3d701b213bd881" gracePeriod=600 Dec 10 12:09:51 crc kubenswrapper[4852]: I1210 12:09:51.196838 4852 generic.go:334] 
"Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="1526ef670a66096e17d3fb224d460b0768d7f2150066a4bc7f3d701b213bd881" exitCode=0 Dec 10 12:09:51 crc kubenswrapper[4852]: I1210 12:09:51.196924 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"1526ef670a66096e17d3fb224d460b0768d7f2150066a4bc7f3d701b213bd881"} Dec 10 12:09:51 crc kubenswrapper[4852]: I1210 12:09:51.197201 4852 scope.go:117] "RemoveContainer" containerID="77c211572bcee4c8a77c07da48869683ba7551ebec91c3aa4c5542663748ddba" Dec 10 12:09:52 crc kubenswrapper[4852]: I1210 12:09:52.921343 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:52 crc kubenswrapper[4852]: I1210 12:09:52.929459 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fc3907c-5313-44d8-90dd-155b24156a1b-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5p988\" (UID: \"3fc3907c-5313-44d8-90dd-155b24156a1b\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:53 crc kubenswrapper[4852]: I1210 12:09:53.003782 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:09:54 crc kubenswrapper[4852]: E1210 12:09:54.355818 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 10 12:09:54 crc kubenswrapper[4852]: E1210 12:09:54.356023 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ng9zw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xkcjc_openstack-operators(524d7bc8-a871-4ff2-bc13-1a84d07bb0e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:54 crc kubenswrapper[4852]: E1210 12:09:54.357279 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" podUID="524d7bc8-a871-4ff2-bc13-1a84d07bb0e9" Dec 10 12:09:55 crc kubenswrapper[4852]: E1210 12:09:55.129488 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 10 12:09:55 crc kubenswrapper[4852]: E1210 12:09:55.129963 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp2rf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-lhzps_openstack-operators(c7a73ae7-6060-497a-b94f-8988c2244f94): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:55 crc kubenswrapper[4852]: E1210 12:09:55.224199 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" podUID="524d7bc8-a871-4ff2-bc13-1a84d07bb0e9" Dec 10 12:09:57 crc kubenswrapper[4852]: I1210 12:09:57.152888 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt"] Dec 10 12:09:57 crc kubenswrapper[4852]: E1210 12:09:57.511430 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 10 12:09:57 crc kubenswrapper[4852]: E1210 12:09:57.511623 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvvc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-xr9c5_openstack-operators(2a55ad46-c35b-4429-b1da-7a361f7c45d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:58 crc kubenswrapper[4852]: E1210 12:09:58.887258 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 10 12:09:58 crc kubenswrapper[4852]: E1210 12:09:58.887780 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4w22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-kc8c8_openstack-operators(785eda15-0a5d-451d-8ec4-b35e1f8d8147): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:09:59 crc kubenswrapper[4852]: I1210 12:09:59.349660 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl"] Dec 10 12:09:59 crc kubenswrapper[4852]: E1210 12:09:59.626066 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 10 12:09:59 crc kubenswrapper[4852]: E1210 12:09:59.626312 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ltxxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-p22mj_openstack-operators(bf62d827-9a6d-4a53-9a65-b287195f3bea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:10:00 crc kubenswrapper[4852]: W1210 12:10:00.210638 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ded9ed7_ec7c_4550_a840_3dd4d7c1c4b2.slice/crio-a8a85de2fa69a0d781c341db3aa5c580e05981e9710b805dc4f7a7ef72c46591 WatchSource:0}: Error finding container a8a85de2fa69a0d781c341db3aa5c580e05981e9710b805dc4f7a7ef72c46591: Status 404 returned error can't find the container with id a8a85de2fa69a0d781c341db3aa5c580e05981e9710b805dc4f7a7ef72c46591 Dec 10 12:10:00 crc kubenswrapper[4852]: W1210 12:10:00.226511 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbce747f_ad24_476e_8746_f2bb89eba637.slice/crio-76cfacc24ae4145e002305125245b2bc54969f6b95eb92764ae6ee5a35756663 WatchSource:0}: Error finding container 76cfacc24ae4145e002305125245b2bc54969f6b95eb92764ae6ee5a35756663: Status 404 returned error can't find the container with id 76cfacc24ae4145e002305125245b2bc54969f6b95eb92764ae6ee5a35756663 Dec 10 12:10:00 crc kubenswrapper[4852]: I1210 12:10:00.252355 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" event={"ID":"bbce747f-ad24-476e-8746-f2bb89eba637","Type":"ContainerStarted","Data":"76cfacc24ae4145e002305125245b2bc54969f6b95eb92764ae6ee5a35756663"} Dec 10 12:10:00 crc kubenswrapper[4852]: I1210 12:10:00.253119 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" event={"ID":"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2","Type":"ContainerStarted","Data":"a8a85de2fa69a0d781c341db3aa5c580e05981e9710b805dc4f7a7ef72c46591"} Dec 10 12:10:00 crc kubenswrapper[4852]: I1210 12:10:00.848255 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988"] Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.278439 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" event={"ID":"75a5b678-ba48-4191-99e6-aeeaf32bf40e","Type":"ContainerStarted","Data":"de568c200b7eb3356a2f158368b9de4de907d1688011aa8fec1c8ab9e3b8e5c7"} Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.291764 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" 
event={"ID":"f53525dc-0dc9-44c5-a947-2e303cb0ed1c","Type":"ContainerStarted","Data":"2cf66f9dacb03471a9d11087f90991e74ff2ac099d060fc02b8911c5ffaea651"} Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.302020 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" event={"ID":"391832bd-03d9-409e-93a0-b8986ed437ff","Type":"ContainerStarted","Data":"d06d09cc7ef2868a22fbaa0607ff924d84659d15216c5768847116daa1761c02"} Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.313940 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" event={"ID":"88a0620c-81a0-4ad1-ae9a-13eb0d08e10f","Type":"ContainerStarted","Data":"5d908f692193ddb16e348d5a595551710dc7066e2808fd4ade59b8ba09e4d7c3"} Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.316762 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" event={"ID":"6b9c74bb-9c09-4976-be53-8b2c296f7788","Type":"ContainerStarted","Data":"14b568ab47c9bb4e7f946e2187fd83b9e30f1e48f8b51b831abc1fac459b90c5"} Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.321517 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" event={"ID":"31cc1af5-d198-472a-aa62-2ce735f4453b","Type":"ContainerStarted","Data":"ec63a3552faed7b2e919b07c3a1fc21d2e645d9228b1dfdd0b6ac9f4cae6e904"} Dec 10 12:10:01 crc kubenswrapper[4852]: I1210 12:10:01.323081 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" event={"ID":"74cd0e4c-bd25-4b22-8b1f-cb3758f446fd","Type":"ContainerStarted","Data":"4ff6d4b9939d442fcb010c569c823e49cea3c77d49606a2a0a3d74a7fa9c0927"} Dec 10 12:10:01 crc kubenswrapper[4852]: E1210 12:10:01.931018 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 10 12:10:01 crc kubenswrapper[4852]: E1210 12:10:01.931167 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bj2v9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-57gb2_openstack-operators(d01d86ae-5138-4298-8ec0-7aa8cdd468fe): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 10 12:10:01 crc kubenswrapper[4852]: E1210 12:10:01.932945 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" podUID="d01d86ae-5138-4298-8ec0-7aa8cdd468fe" Dec 10 12:10:02 crc kubenswrapper[4852]: I1210 12:10:02.332599 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"15e58a6d5758dde8e8be6570ea8629914b8054e6378a86d3d8b1552b7be80d78"} Dec 10 12:10:02 crc kubenswrapper[4852]: I1210 12:10:02.336114 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" event={"ID":"62e793a9-5b13-4532-90fe-d3313b3cf4d9","Type":"ContainerStarted","Data":"603bc74147a98645ca8517a02e21ac2e0880d9e06e88ba2fa6085a2c7a92f424"} Dec 10 12:10:02 crc kubenswrapper[4852]: I1210 12:10:02.337575 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" event={"ID":"3fc3907c-5313-44d8-90dd-155b24156a1b","Type":"ContainerStarted","Data":"6f9840db060c4ab358536ae16a3d2f478e3e2b0b9a69482afed7cfd52000b189"} Dec 10 12:10:02 crc kubenswrapper[4852]: E1210 12:10:02.747050 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 10 12:10:02 crc kubenswrapper[4852]: E1210 12:10:02.747190 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2zv4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-pnppk_openstack-operators(97d20a41-52e0-47d5-86fd-0f486080ebf5): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 10 12:10:02 crc kubenswrapper[4852]: E1210 12:10:02.748678 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" podUID="97d20a41-52e0-47d5-86fd-0f486080ebf5" Dec 10 12:10:03 crc kubenswrapper[4852]: I1210 12:10:03.345167 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" event={"ID":"bbce747f-ad24-476e-8746-f2bb89eba637","Type":"ContainerStarted","Data":"c4c08343b5cc56985b31d08e6c0bfbf22e44d8dc2e842132d4271733dd412be7"} Dec 10 12:10:03 crc kubenswrapper[4852]: I1210 12:10:03.345625 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:10:03 crc kubenswrapper[4852]: I1210 12:10:03.347577 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" event={"ID":"9c39ec89-c5bf-4cdd-a253-154db7bcf781","Type":"ContainerStarted","Data":"98b9c187fc59879151e2a29e18794d04d2e5a69adc1a9bc55face15a23c235a4"} Dec 10 12:10:03 crc kubenswrapper[4852]: I1210 12:10:03.349908 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" event={"ID":"67ab896e-72eb-4040-9397-2a2bcca37c7e","Type":"ContainerStarted","Data":"f0d5b2d623b77b259a96f5068555c48e853c9381ffea199bd16be563f8419bc4"} Dec 10 12:10:03 crc kubenswrapper[4852]: I1210 12:10:03.410098 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" podStartSLOduration=42.410079991 podStartE2EDuration="42.410079991s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:10:03.387380456 +0000 UTC m=+1089.472905700" watchObservedRunningTime="2025-12-10 12:10:03.410079991 +0000 UTC m=+1089.495605215" Dec 10 12:10:07 crc 
kubenswrapper[4852]: E1210 12:10:07.361151 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 10 12:10:07 crc kubenswrapper[4852]: E1210 12:10:07.362122 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vdrnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-54gtc_openstack-operators(f67c3362-3da1-45f6-8fc6-47e16b206173): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 10 12:10:07 crc kubenswrapper[4852]: E1210 12:10:07.363458 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" podUID="f67c3362-3da1-45f6-8fc6-47e16b206173" Dec 10 12:10:07 crc kubenswrapper[4852]: I1210 12:10:07.536114 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6d7c94c9c8-s6npl" Dec 10 12:10:10 crc kubenswrapper[4852]: E1210 12:10:10.344753 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 10 12:10:10 crc kubenswrapper[4852]: E1210 12:10:10.345397 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp2rf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-lhzps_openstack-operators(c7a73ae7-6060-497a-b94f-8988c2244f94): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 10 12:10:10 crc kubenswrapper[4852]: E1210 12:10:10.347070 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" podUID="c7a73ae7-6060-497a-b94f-8988c2244f94" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.466919 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" event={"ID":"75a5b678-ba48-4191-99e6-aeeaf32bf40e","Type":"ContainerStarted","Data":"c76ef3dd2f6d69bc6eac335a68a195b53ca364b48027738c7231f3843498e038"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.470508 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.471017 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.472193 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" event={"ID":"97d20a41-52e0-47d5-86fd-0f486080ebf5","Type":"ContainerStarted","Data":"e73d9ce0371af12539089b2044f904bc69283794d25c99af9005c1511fb05edc"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.474275 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" event={"ID":"2a5cb708-ca60-4763-bf61-6562a610e6dc","Type":"ContainerStarted","Data":"5745f382bdd7dc67828256bd8fb7422acfdafbd913e4571a278ae4c1c058b001"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.474302 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" 
event={"ID":"2a5cb708-ca60-4763-bf61-6562a610e6dc","Type":"ContainerStarted","Data":"c6f85d62d4ae3416e71f618a7546440128b3a5b2de8f99fb27fbbc88cf30181b"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.474963 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.476384 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" event={"ID":"f67c3362-3da1-45f6-8fc6-47e16b206173","Type":"ContainerStarted","Data":"9cdd81dfeca1fed96409be0439dc5588a676c53150fa6091a5cfa5a33d945cc0"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.477708 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" event={"ID":"31cc1af5-d198-472a-aa62-2ce735f4453b","Type":"ContainerStarted","Data":"7c27f245194500e2fe0a3964d4df34806100346933d10ecad0552eea3a692f90"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.479359 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.480034 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.480884 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" event={"ID":"d01d86ae-5138-4298-8ec0-7aa8cdd468fe","Type":"ContainerStarted","Data":"7ff5b6a5271823673afbc690be3bbfe7d5cdba2538d1fd8c54f65c381014afd8"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.482147 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" event={"ID":"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2","Type":"ContainerStarted","Data":"e34d320a4e093707f6782a40b0becadbf334f8a6451ab743628af02c2f10f06a"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.483339 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" event={"ID":"3fc3907c-5313-44d8-90dd-155b24156a1b","Type":"ContainerStarted","Data":"126267e6cf1f213655c089ad90afdc6c62d2ed5c3d88a88620455be2c9104b22"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.484700 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" event={"ID":"524d7bc8-a871-4ff2-bc13-1a84d07bb0e9","Type":"ContainerStarted","Data":"05960a45f13e5789ab65c1c64ee87a3e1a62204aee86d0f473441833b6c942e7"} Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.589217 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-86mvp" podStartSLOduration=2.769652186 podStartE2EDuration="1m0.589201305s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.033758282 +0000 UTC m=+1049.119283506" lastFinishedPulling="2025-12-10 12:10:20.853307401 +0000 UTC m=+1106.938832625" observedRunningTime="2025-12-10 12:10:21.588761664 +0000 UTC m=+1107.674286878" watchObservedRunningTime="2025-12-10 12:10:21.589201305 +0000 UTC m=+1107.674726529" Dec 10 12:10:21 
crc kubenswrapper[4852]: I1210 12:10:21.629828 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkcjc" podStartSLOduration=3.506758279 podStartE2EDuration="1m0.629807797s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.245057523 +0000 UTC m=+1049.330582747" lastFinishedPulling="2025-12-10 12:10:20.368107021 +0000 UTC m=+1106.453632265" observedRunningTime="2025-12-10 12:10:21.626818682 +0000 UTC m=+1107.712343906" watchObservedRunningTime="2025-12-10 12:10:21.629807797 +0000 UTC m=+1107.715333021" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.685888 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" podStartSLOduration=11.823247061 podStartE2EDuration="1m1.685873313s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.076072455 +0000 UTC m=+1049.161597679" lastFinishedPulling="2025-12-10 12:10:12.938698697 +0000 UTC m=+1099.024223931" observedRunningTime="2025-12-10 12:10:21.678585901 +0000 UTC m=+1107.764111115" watchObservedRunningTime="2025-12-10 12:10:21.685873313 +0000 UTC m=+1107.771398527" Dec 10 12:10:21 crc kubenswrapper[4852]: I1210 12:10:21.724302 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-q26ll" podStartSLOduration=2.8547426849999997 podStartE2EDuration="1m0.724284429s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.984032394 +0000 UTC m=+1049.069557628" lastFinishedPulling="2025-12-10 12:10:20.853574158 +0000 UTC m=+1106.939099372" observedRunningTime="2025-12-10 12:10:21.722060164 +0000 UTC m=+1107.807585408" watchObservedRunningTime="2025-12-10 12:10:21.724284429 +0000 UTC m=+1107.809809653" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.493620 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" event={"ID":"3fc3907c-5313-44d8-90dd-155b24156a1b","Type":"ContainerStarted","Data":"97b68eedf34517888944f5a8ec5158f322ca42988e870b2804871df0942f1577"} Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.493963 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.496559 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" event={"ID":"d01d86ae-5138-4298-8ec0-7aa8cdd468fe","Type":"ContainerStarted","Data":"eefee89af1e207b612cbf849eef874d61a9cbec0c339db13ce1706c8cf1307e4"} Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.496635 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.499240 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" event={"ID":"97d20a41-52e0-47d5-86fd-0f486080ebf5","Type":"ContainerStarted","Data":"bd336b9e6c03245ca1fdc037cc32e3c5b64b18475e0128596a8e82ba0a55a35b"} Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.499366 4852 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.501575 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" event={"ID":"f67c3362-3da1-45f6-8fc6-47e16b206173","Type":"ContainerStarted","Data":"a491bf9cd2b29720d7f63b0634f8ee2690b253cbd4ac7409456bfb0e96227404"} Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.501674 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.503376 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" event={"ID":"3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2","Type":"ContainerStarted","Data":"8de6226c62194af2c63bffc174ed9cabaca3c98240f9d3c8bbbecf87fb4165ea"} Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.558337 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" podStartSLOduration=46.331914097 podStartE2EDuration="1m2.558298836s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:10:02.029960387 +0000 UTC m=+1088.115485611" lastFinishedPulling="2025-12-10 12:10:18.256345126 +0000 UTC m=+1104.341870350" observedRunningTime="2025-12-10 12:10:22.539102798 +0000 UTC m=+1108.624628022" watchObservedRunningTime="2025-12-10 12:10:22.558298836 +0000 UTC m=+1108.643824070" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.592705 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" podStartSLOduration=7.256428988 podStartE2EDuration="1m1.592669632s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.028092921 +0000 UTC m=+1049.113618145" lastFinishedPulling="2025-12-10 12:10:17.364333555 +0000 UTC m=+1103.449858789" observedRunningTime="2025-12-10 12:10:22.578046888 +0000 UTC m=+1108.663572112" watchObservedRunningTime="2025-12-10 12:10:22.592669632 +0000 UTC m=+1108.678194856" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.605889 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" podStartSLOduration=5.267593935 podStartE2EDuration="1m2.605870661s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.029492556 +0000 UTC m=+1049.115017780" lastFinishedPulling="2025-12-10 12:10:20.367769282 +0000 UTC m=+1106.453294506" observedRunningTime="2025-12-10 12:10:22.602693292 +0000 UTC m=+1108.688218516" watchObservedRunningTime="2025-12-10 12:10:22.605870661 +0000 UTC m=+1108.691395885" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.634878 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" podStartSLOduration=6.90713455 podStartE2EDuration="1m2.634852243s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.526951482 +0000 UTC m=+1048.612476706" lastFinishedPulling="2025-12-10 12:10:18.254669175 +0000 UTC m=+1104.340194399" observedRunningTime="2025-12-10 12:10:22.626544096 
+0000 UTC m=+1108.712069320" watchObservedRunningTime="2025-12-10 12:10:22.634852243 +0000 UTC m=+1108.720377467" Dec 10 12:10:22 crc kubenswrapper[4852]: I1210 12:10:22.658708 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" podStartSLOduration=44.072559678 podStartE2EDuration="1m1.658688466s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:10:00.220215853 +0000 UTC m=+1086.305741087" lastFinishedPulling="2025-12-10 12:10:17.806344651 +0000 UTC m=+1103.891869875" observedRunningTime="2025-12-10 12:10:22.651849506 +0000 UTC m=+1108.737374750" watchObservedRunningTime="2025-12-10 12:10:22.658688466 +0000 UTC m=+1108.744213690" Dec 10 12:10:23 crc kubenswrapper[4852]: I1210 12:10:23.512149 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" event={"ID":"391832bd-03d9-409e-93a0-b8986ed437ff","Type":"ContainerStarted","Data":"3ae6590bfb9d8514d5f00d1604603d80d94ed56f2700fb17a5ff6be7d7322243"} Dec 10 12:10:23 crc kubenswrapper[4852]: I1210 12:10:23.513705 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:10:23 crc kubenswrapper[4852]: I1210 12:10:23.538578 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" podStartSLOduration=3.588304819 podStartE2EDuration="1m3.538557985s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.517411524 +0000 UTC m=+1048.602936758" lastFinishedPulling="2025-12-10 12:10:22.4676647 +0000 UTC m=+1108.553189924" observedRunningTime="2025-12-10 12:10:23.532218808 +0000 UTC m=+1109.617744022" watchObservedRunningTime="2025-12-10 12:10:23.538557985 +0000 UTC m=+1109.624083209" Dec 10 12:10:24 crc kubenswrapper[4852]: E1210 12:10:24.472780 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 10 12:10:24 crc kubenswrapper[4852]: E1210 12:10:24.473260 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28z48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-nlcxk_openstack-operators(f53525dc-0dc9-44c5-a947-2e303cb0ed1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:10:24 crc kubenswrapper[4852]: E1210 12:10:24.474419 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" podUID="f53525dc-0dc9-44c5-a947-2e303cb0ed1c" Dec 10 12:10:24 crc kubenswrapper[4852]: I1210 12:10:24.519936 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" event={"ID":"88a0620c-81a0-4ad1-ae9a-13eb0d08e10f","Type":"ContainerStarted","Data":"a4a9bbd3c9dfc8832c2d689a5641a6371aeb14b32f2bdd6e354ff96d6e672b4d"} Dec 10 12:10:24 crc kubenswrapper[4852]: I1210 12:10:24.520456 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:10:24 crc kubenswrapper[4852]: I1210 12:10:24.520588 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:10:24 crc kubenswrapper[4852]: I1210 12:10:24.522208 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-j8h26" Dec 10 12:10:24 crc kubenswrapper[4852]: I1210 12:10:24.523345 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" Dec 10 12:10:24 crc kubenswrapper[4852]: I1210 12:10:24.558118 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" podStartSLOduration=3.91964633 podStartE2EDuration="1m4.558074261s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.406047821 +0000 UTC m=+1048.491573045" lastFinishedPulling="2025-12-10 12:10:23.044475752 +0000 UTC m=+1109.130000976" observedRunningTime="2025-12-10 12:10:24.557963808 +0000 UTC m=+1110.643489032" watchObservedRunningTime="2025-12-10 12:10:24.558074261 +0000 UTC m=+1110.643599475" Dec 10 12:10:25 crc kubenswrapper[4852]: I1210 12:10:25.528525 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" 
event={"ID":"f53525dc-0dc9-44c5-a947-2e303cb0ed1c","Type":"ContainerStarted","Data":"9d3ed2ca5fb6c4c849c6817b520deb51ce6d5329a5293fdb23dddde13fcd5d6d"} Dec 10 12:10:25 crc kubenswrapper[4852]: I1210 12:10:25.528985 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:10:25 crc kubenswrapper[4852]: I1210 12:10:25.531641 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bx58k" Dec 10 12:10:25 crc kubenswrapper[4852]: I1210 12:10:25.555461 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nlcxk" podStartSLOduration=33.678877342 podStartE2EDuration="1m5.555438426s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.028363157 +0000 UTC m=+1049.113888381" lastFinishedPulling="2025-12-10 12:09:54.904924241 +0000 UTC m=+1080.990449465" observedRunningTime="2025-12-10 12:10:25.551915668 +0000 UTC m=+1111.637440912" watchObservedRunningTime="2025-12-10 12:10:25.555438426 +0000 UTC m=+1111.640963680" Dec 10 12:10:27 crc kubenswrapper[4852]: I1210 12:10:27.270372 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f8csbt" Dec 10 12:10:27 crc kubenswrapper[4852]: E1210 12:10:27.299512 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" podUID="785eda15-0a5d-451d-8ec4-b35e1f8d8147" Dec 10 12:10:27 crc kubenswrapper[4852]: I1210 12:10:27.547190 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" event={"ID":"785eda15-0a5d-451d-8ec4-b35e1f8d8147","Type":"ContainerStarted","Data":"f3e61d9aa1138bb1e74eeac64e27ace6cc957c82d33b1bda221dafd0a505efba"} Dec 10 12:10:28 crc kubenswrapper[4852]: I1210 12:10:28.557665 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" event={"ID":"74cd0e4c-bd25-4b22-8b1f-cb3758f446fd","Type":"ContainerStarted","Data":"9236f1625c5c543a75f216e86bb12dc055ad10741547fc4e1558619709e45cb1"} Dec 10 12:10:28 crc kubenswrapper[4852]: I1210 12:10:28.558101 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:10:28 crc kubenswrapper[4852]: I1210 12:10:28.560250 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" Dec 10 12:10:28 crc kubenswrapper[4852]: I1210 12:10:28.574537 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-tlflj" podStartSLOduration=3.648905248 podStartE2EDuration="1m8.574510041s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.54817394 +0000 UTC m=+1048.633699164" lastFinishedPulling="2025-12-10 12:10:27.473778733 +0000 UTC m=+1113.559303957" observedRunningTime="2025-12-10 12:10:28.572727097 +0000 UTC 
m=+1114.658252331" watchObservedRunningTime="2025-12-10 12:10:28.574510041 +0000 UTC m=+1114.660035265" Dec 10 12:10:28 crc kubenswrapper[4852]: E1210 12:10:28.670580 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" podUID="79986568-4439-4f2a-9dc4-af5fb1a1d787" Dec 10 12:10:29 crc kubenswrapper[4852]: I1210 12:10:29.566766 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" event={"ID":"62e793a9-5b13-4532-90fe-d3313b3cf4d9","Type":"ContainerStarted","Data":"bc065f321428fc405e3969b09272a631ca53236994d65ab98c75440aef1ac27e"} Dec 10 12:10:29 crc kubenswrapper[4852]: I1210 12:10:29.568146 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:10:29 crc kubenswrapper[4852]: I1210 12:10:29.569172 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" event={"ID":"79986568-4439-4f2a-9dc4-af5fb1a1d787","Type":"ContainerStarted","Data":"9a8be687f3940b9a3bd3fc6639d6580a72101c3ea85c6fc3a6d98c69c99271d5"} Dec 10 12:10:29 crc kubenswrapper[4852]: I1210 12:10:29.571867 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" Dec 10 12:10:29 crc kubenswrapper[4852]: I1210 12:10:29.590403 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rhwzx" podStartSLOduration=3.748991021 podStartE2EDuration="1m9.590383427s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.561246996 +0000 UTC m=+1048.646772220" lastFinishedPulling="2025-12-10 12:10:28.402639412 +0000 UTC m=+1114.488164626" observedRunningTime="2025-12-10 12:10:29.589050834 +0000 UTC m=+1115.674576058" watchObservedRunningTime="2025-12-10 12:10:29.590383427 +0000 UTC m=+1115.675908661" Dec 10 12:10:31 crc kubenswrapper[4852]: I1210 12:10:31.086850 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-pnppk" Dec 10 12:10:31 crc kubenswrapper[4852]: I1210 12:10:31.295990 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4kkrb" Dec 10 12:10:31 crc kubenswrapper[4852]: I1210 12:10:31.495889 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-54gtc" Dec 10 12:10:31 crc kubenswrapper[4852]: I1210 12:10:31.911136 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-57gb2" Dec 10 12:10:33 crc kubenswrapper[4852]: I1210 12:10:33.011315 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5p988" Dec 10 12:10:33 crc kubenswrapper[4852]: E1210 12:10:33.831449 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" podUID="bf62d827-9a6d-4a53-9a65-b287195f3bea" Dec 10 12:10:33 crc kubenswrapper[4852]: E1210 12:10:33.921534 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:4fa131a1b726b2d6468d461e7d8867a2157d5671f712461d8abd126155fdf9ce: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:4fa131a1b726b2d6468d461e7d8867a2157d5671f712461d8abd126155fdf9ce\": context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 10 12:10:33 crc kubenswrapper[4852]: E1210 12:10:33.921893 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvvc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-xr9c5_openstack-operators(2a55ad46-c35b-4429-b1da-7a361f7c45d0): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:4fa131a1b726b2d6468d461e7d8867a2157d5671f712461d8abd126155fdf9ce: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:4fa131a1b726b2d6468d461e7d8867a2157d5671f712461d8abd126155fdf9ce\": context canceled" logger="UnhandledError" Dec 10 12:10:33 crc kubenswrapper[4852]: E1210 12:10:33.923220 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:4fa131a1b726b2d6468d461e7d8867a2157d5671f712461d8abd126155fdf9ce: Get \\\"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:4fa131a1b726b2d6468d461e7d8867a2157d5671f712461d8abd126155fdf9ce\\\": context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" podUID="2a55ad46-c35b-4429-b1da-7a361f7c45d0" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.605654 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" event={"ID":"6b9c74bb-9c09-4976-be53-8b2c296f7788","Type":"ContainerStarted","Data":"3a456bf74888c547061a8228e047f08a6c2576cf58873806a530f7c1b311583b"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.606086 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.607938 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" event={"ID":"bf62d827-9a6d-4a53-9a65-b287195f3bea","Type":"ContainerStarted","Data":"22d5eca38fc6015f4658edc9726f16af7fc425fabd05258349ab3b00e797840c"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.608545 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.610501 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" event={"ID":"9c39ec89-c5bf-4cdd-a253-154db7bcf781","Type":"ContainerStarted","Data":"8a232f75e7568401db629dc15b63a6fd0f09e684353333d7d2dac91e1ca1cf50"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.610897 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.614188 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" event={"ID":"67ab896e-72eb-4040-9397-2a2bcca37c7e","Type":"ContainerStarted","Data":"e7e658533e97bd1272863c95517536769b2c5f77f9488c428e81397685734f19"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.615259 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.617414 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.618124 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.619081 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" event={"ID":"c7a73ae7-6060-497a-b94f-8988c2244f94","Type":"ContainerStarted","Data":"b6566b4f1ecf51261bc4a08d4588088eb4e08a7f2099dab9b399c2a7ba5e97d5"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.619193 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" event={"ID":"c7a73ae7-6060-497a-b94f-8988c2244f94","Type":"ContainerStarted","Data":"370be33e3021d966bde706fd2ce5b946ea9829d8b3fb9f0823c11f46eb0529c5"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.619450 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.621003 4852 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" event={"ID":"785eda15-0a5d-451d-8ec4-b35e1f8d8147","Type":"ContainerStarted","Data":"e34af7749edff2ec70ebb55db6235b331c891806e3349ac4d518b8b6c2f00de9"} Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.621213 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.621705 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zhxkc" podStartSLOduration=2.77703659 podStartE2EDuration="1m13.621687608s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.040219043 +0000 UTC m=+1049.125744267" lastFinishedPulling="2025-12-10 12:10:33.884870061 +0000 UTC m=+1119.970395285" observedRunningTime="2025-12-10 12:10:34.618797036 +0000 UTC m=+1120.704322260" watchObservedRunningTime="2025-12-10 12:10:34.621687608 +0000 UTC m=+1120.707212832" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.639926 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-b2jw2" podStartSLOduration=8.772850877 podStartE2EDuration="1m14.639908432s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:22.798873693 +0000 UTC m=+1048.884398917" lastFinishedPulling="2025-12-10 12:10:28.665931248 +0000 UTC m=+1114.751456472" observedRunningTime="2025-12-10 12:10:34.637123863 +0000 UTC m=+1120.722649087" watchObservedRunningTime="2025-12-10 12:10:34.639908432 +0000 UTC m=+1120.725433656" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.728614 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-gbgfx" podStartSLOduration=9.044889512 podStartE2EDuration="1m14.728598631s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.068944328 +0000 UTC m=+1049.154469542" lastFinishedPulling="2025-12-10 12:10:28.752653437 +0000 UTC m=+1114.838178661" observedRunningTime="2025-12-10 12:10:34.724923449 +0000 UTC m=+1120.810448693" watchObservedRunningTime="2025-12-10 12:10:34.728598631 +0000 UTC m=+1120.814123855" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.746476 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" podStartSLOduration=8.03961015 podStartE2EDuration="1m13.746447075s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.058943929 +0000 UTC m=+1049.144469153" lastFinishedPulling="2025-12-10 12:10:28.765780844 +0000 UTC m=+1114.851306078" observedRunningTime="2025-12-10 12:10:34.743893041 +0000 UTC m=+1120.829418265" watchObservedRunningTime="2025-12-10 12:10:34.746447075 +0000 UTC m=+1120.831972309" Dec 10 12:10:34 crc kubenswrapper[4852]: I1210 12:10:34.766129 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" podStartSLOduration=8.324151176 podStartE2EDuration="1m13.766110465s" podCreationTimestamp="2025-12-10 12:09:21 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.245080854 +0000 UTC m=+1049.330606078" 
lastFinishedPulling="2025-12-10 12:10:28.687040143 +0000 UTC m=+1114.772565367" observedRunningTime="2025-12-10 12:10:34.757773167 +0000 UTC m=+1120.843298401" watchObservedRunningTime="2025-12-10 12:10:34.766110465 +0000 UTC m=+1120.851635689" Dec 10 12:10:35 crc kubenswrapper[4852]: I1210 12:10:35.628596 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" event={"ID":"79986568-4439-4f2a-9dc4-af5fb1a1d787","Type":"ContainerStarted","Data":"bdcd71c31f6ccaf36f955b4d4792ab5560e9fd8254da1e8a6da892ae8a1d6efe"} Dec 10 12:10:35 crc kubenswrapper[4852]: I1210 12:10:35.628890 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:10:35 crc kubenswrapper[4852]: I1210 12:10:35.630714 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" event={"ID":"bf62d827-9a6d-4a53-9a65-b287195f3bea","Type":"ContainerStarted","Data":"5e4dfaf0bf341b07b60b350d969575d0c098d52e6dc230519b2601c6e3609de6"} Dec 10 12:10:35 crc kubenswrapper[4852]: I1210 12:10:35.631129 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:10:35 crc kubenswrapper[4852]: I1210 12:10:35.646630 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" podStartSLOduration=4.299794026 podStartE2EDuration="1m15.646614819s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.029197928 +0000 UTC m=+1049.114723152" lastFinishedPulling="2025-12-10 12:10:34.376018721 +0000 UTC m=+1120.461543945" observedRunningTime="2025-12-10 12:10:35.645184643 +0000 UTC m=+1121.730709877" watchObservedRunningTime="2025-12-10 12:10:35.646614819 +0000 UTC m=+1121.732140043" Dec 10 12:10:35 crc kubenswrapper[4852]: I1210 12:10:35.664942 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" podStartSLOduration=3.325022704 podStartE2EDuration="1m15.664921254s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.060432776 +0000 UTC m=+1049.145958000" lastFinishedPulling="2025-12-10 12:10:35.400331326 +0000 UTC m=+1121.485856550" observedRunningTime="2025-12-10 12:10:35.65870384 +0000 UTC m=+1121.744229074" watchObservedRunningTime="2025-12-10 12:10:35.664921254 +0000 UTC m=+1121.750446478" Dec 10 12:10:41 crc kubenswrapper[4852]: I1210 12:10:41.400798 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pzw5d" Dec 10 12:10:41 crc kubenswrapper[4852]: I1210 12:10:41.431427 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p22mj" Dec 10 12:10:41 crc kubenswrapper[4852]: I1210 12:10:41.626308 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-kc8c8" Dec 10 12:10:41 crc kubenswrapper[4852]: I1210 12:10:41.772586 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-lhzps" Dec 10 
12:10:47 crc kubenswrapper[4852]: I1210 12:10:47.719660 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" event={"ID":"2a55ad46-c35b-4429-b1da-7a361f7c45d0","Type":"ContainerStarted","Data":"faf659c041450ff72a868d1010fec9a63c8a061806f805beedd70997eecad0f0"} Dec 10 12:10:47 crc kubenswrapper[4852]: I1210 12:10:47.720176 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" event={"ID":"2a55ad46-c35b-4429-b1da-7a361f7c45d0","Type":"ContainerStarted","Data":"ee63ddc346bfc01123783a17c1d9ea4727721789b472073613151bc4219ab6da"} Dec 10 12:10:47 crc kubenswrapper[4852]: I1210 12:10:47.720334 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:10:47 crc kubenswrapper[4852]: I1210 12:10:47.739827 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" podStartSLOduration=4.040916631 podStartE2EDuration="1m27.739807053s" podCreationTimestamp="2025-12-10 12:09:20 +0000 UTC" firstStartedPulling="2025-12-10 12:09:23.068646921 +0000 UTC m=+1049.154172145" lastFinishedPulling="2025-12-10 12:10:46.767537353 +0000 UTC m=+1132.853062567" observedRunningTime="2025-12-10 12:10:47.735140777 +0000 UTC m=+1133.820666001" watchObservedRunningTime="2025-12-10 12:10:47.739807053 +0000 UTC m=+1133.825332277" Dec 10 12:10:51 crc kubenswrapper[4852]: I1210 12:10:51.452087 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xr9c5" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.137133 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jthws"] Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.140329 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.142500 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-76ms5" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.146456 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.146657 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.146777 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.164787 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jthws"] Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.204748 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jkdn9"] Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.206249 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.209012 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.219721 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jkdn9"] Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.231195 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svpn\" (UniqueName: \"kubernetes.io/projected/ab983f60-c76c-44f2-bd26-99206a0d42b7-kube-api-access-4svpn\") pod \"dnsmasq-dns-675f4bcbfc-jthws\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.231254 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab983f60-c76c-44f2-bd26-99206a0d42b7-config\") pod \"dnsmasq-dns-675f4bcbfc-jthws\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.332809 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab983f60-c76c-44f2-bd26-99206a0d42b7-config\") pod \"dnsmasq-dns-675f4bcbfc-jthws\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.332904 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjd6r\" (UniqueName: \"kubernetes.io/projected/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-kube-api-access-bjd6r\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.332945 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.332978 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svpn\" (UniqueName: \"kubernetes.io/projected/ab983f60-c76c-44f2-bd26-99206a0d42b7-kube-api-access-4svpn\") pod \"dnsmasq-dns-675f4bcbfc-jthws\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.332998 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-config\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.333892 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab983f60-c76c-44f2-bd26-99206a0d42b7-config\") pod \"dnsmasq-dns-675f4bcbfc-jthws\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 
12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.355364 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svpn\" (UniqueName: \"kubernetes.io/projected/ab983f60-c76c-44f2-bd26-99206a0d42b7-kube-api-access-4svpn\") pod \"dnsmasq-dns-675f4bcbfc-jthws\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.434520 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-config\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.434665 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjd6r\" (UniqueName: \"kubernetes.io/projected/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-kube-api-access-bjd6r\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.434713 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.435771 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-config\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.435832 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.451276 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjd6r\" (UniqueName: \"kubernetes.io/projected/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-kube-api-access-bjd6r\") pod \"dnsmasq-dns-78dd6ddcc-jkdn9\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.475181 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.531933 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.799704 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jkdn9"] Dec 10 12:11:08 crc kubenswrapper[4852]: W1210 12:11:08.805254 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba40c379_ddd2_43fc_8ed6_db6b12a1efe8.slice/crio-2555b407b5147e17bec5873c0420f27989583873e42a9b98b24baeea76797225 WatchSource:0}: Error finding container 2555b407b5147e17bec5873c0420f27989583873e42a9b98b24baeea76797225: Status 404 returned error can't find the container with id 2555b407b5147e17bec5873c0420f27989583873e42a9b98b24baeea76797225 Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.886867 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" event={"ID":"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8","Type":"ContainerStarted","Data":"2555b407b5147e17bec5873c0420f27989583873e42a9b98b24baeea76797225"} Dec 10 12:11:08 crc kubenswrapper[4852]: I1210 12:11:08.932320 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jthws"] Dec 10 12:11:09 crc kubenswrapper[4852]: I1210 12:11:09.897821 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" event={"ID":"ab983f60-c76c-44f2-bd26-99206a0d42b7","Type":"ContainerStarted","Data":"c89466ebdbeccaecc0d06b1bb29dba81fdc747e3cd1628d85fc1fb798498ffb2"} Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.301612 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jthws"] Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.327572 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5zwrg"] Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.329070 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.340137 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5zwrg"] Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.486142 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.491659 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpf6b\" (UniqueName: \"kubernetes.io/projected/1d872505-242e-4adc-acdd-756183702aba-kube-api-access-dpf6b\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.491794 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-config\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.593680 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-config\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.593776 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.593809 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpf6b\" (UniqueName: \"kubernetes.io/projected/1d872505-242e-4adc-acdd-756183702aba-kube-api-access-dpf6b\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.594860 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.594864 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-config\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.606112 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jkdn9"] Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.622884 
4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpf6b\" (UniqueName: \"kubernetes.io/projected/1d872505-242e-4adc-acdd-756183702aba-kube-api-access-dpf6b\") pod \"dnsmasq-dns-666b6646f7-5zwrg\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") " pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.653470 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-srnc7"] Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.654678 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.671025 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.677586 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-srnc7"] Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.796470 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xslq\" (UniqueName: \"kubernetes.io/projected/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-kube-api-access-4xslq\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.796551 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.796614 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-config\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.898442 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xslq\" (UniqueName: \"kubernetes.io/projected/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-kube-api-access-4xslq\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.898537 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.898576 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-config\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.899689 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.899745 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-config\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.919182 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xslq\" (UniqueName: \"kubernetes.io/projected/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-kube-api-access-4xslq\") pod \"dnsmasq-dns-57d769cc4f-srnc7\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:11 crc kubenswrapper[4852]: I1210 12:11:11.992718 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.444668 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.446167 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.447716 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.448619 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.449516 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.449595 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.449766 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-c8894" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.449970 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.450821 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.460672 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520652 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520701 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " 
pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520727 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520752 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09a9edae-3cd0-4f71-ba18-9800a7baefef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520771 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-config-data\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520788 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09a9edae-3cd0-4f71-ba18-9800a7baefef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520808 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520825 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520844 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520857 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdv5\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-kube-api-access-tmdv5\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.520873 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 
12:11:12.622267 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622392 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622427 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622460 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09a9edae-3cd0-4f71-ba18-9800a7baefef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622485 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-config-data\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622511 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09a9edae-3cd0-4f71-ba18-9800a7baefef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622540 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622564 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622594 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622613 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdv5\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-kube-api-access-tmdv5\") pod \"rabbitmq-server-0\" (UID: 
\"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.622636 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.623097 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.623121 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.623184 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.624199 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.624472 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-config-data\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.625903 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.626792 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09a9edae-3cd0-4f71-ba18-9800a7baefef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.627581 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.640579 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.641347 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdv5\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-kube-api-access-tmdv5\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.641935 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09a9edae-3cd0-4f71-ba18-9800a7baefef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.650502 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.750279 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.751949 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.754391 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9gkv5" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.755068 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.755253 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.755307 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.755377 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.755464 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.756968 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.773511 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.778443 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.824796 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdlxh\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-kube-api-access-kdlxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.824860 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.824903 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.824933 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.824954 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.824973 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.825005 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15a1ed1e-209b-4c71-b15f-44caaec70e93-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.825123 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.825352 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.825387 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15a1ed1e-209b-4c71-b15f-44caaec70e93-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.825546 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927370 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927457 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdlxh\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-kube-api-access-kdlxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927491 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927525 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927553 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927577 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927597 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc 
kubenswrapper[4852]: I1210 12:11:12.927628 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15a1ed1e-209b-4c71-b15f-44caaec70e93-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927656 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927705 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.927727 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15a1ed1e-209b-4c71-b15f-44caaec70e93-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.928341 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.928829 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.928862 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.929022 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.929810 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.929968 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.931865 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15a1ed1e-209b-4c71-b15f-44caaec70e93-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.936316 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15a1ed1e-209b-4c71-b15f-44caaec70e93-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.942134 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.945724 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdlxh\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-kube-api-access-kdlxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.947170 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:12 crc kubenswrapper[4852]: I1210 12:11:12.947551 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:13 crc kubenswrapper[4852]: I1210 12:11:13.081048 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.383893 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.385489 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.389024 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.390362 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.390689 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.390951 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ppv4m" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.400443 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.430163 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450704 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d466b79-84c0-42e9-8952-8491b4ced74e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450761 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxk2\" (UniqueName: \"kubernetes.io/projected/2d466b79-84c0-42e9-8952-8491b4ced74e-kube-api-access-wlxk2\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450806 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-config-data-default\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450835 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d466b79-84c0-42e9-8952-8491b4ced74e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450854 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450875 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450947 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-kolla-config\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.450983 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d466b79-84c0-42e9-8952-8491b4ced74e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.552518 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxk2\" (UniqueName: \"kubernetes.io/projected/2d466b79-84c0-42e9-8952-8491b4ced74e-kube-api-access-wlxk2\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.552902 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-config-data-default\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554213 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d466b79-84c0-42e9-8952-8491b4ced74e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554390 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554521 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554680 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-kolla-config\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554803 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d466b79-84c0-42e9-8952-8491b4ced74e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554961 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d466b79-84c0-42e9-8952-8491b4ced74e-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.555539 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2d466b79-84c0-42e9-8952-8491b4ced74e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.555886 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.554134 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-config-data-default\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.556517 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.557308 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2d466b79-84c0-42e9-8952-8491b4ced74e-kolla-config\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.560959 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d466b79-84c0-42e9-8952-8491b4ced74e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.561184 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d466b79-84c0-42e9-8952-8491b4ced74e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.570776 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxk2\" (UniqueName: \"kubernetes.io/projected/2d466b79-84c0-42e9-8952-8491b4ced74e-kube-api-access-wlxk2\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.582319 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"2d466b79-84c0-42e9-8952-8491b4ced74e\") " pod="openstack/openstack-galera-0" Dec 10 12:11:14 crc kubenswrapper[4852]: I1210 12:11:14.715134 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.766467 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.767749 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.769546 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.770348 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-d2hfr" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.770484 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.770506 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.784288 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873254 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873305 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873350 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873457 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873495 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz87h\" (UniqueName: \"kubernetes.io/projected/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-kube-api-access-xz87h\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873568 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873641 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.873673 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.975863 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.975932 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz87h\" (UniqueName: \"kubernetes.io/projected/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-kube-api-access-xz87h\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.975974 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976015 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976040 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976171 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976206 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976252 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976536 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976745 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.976979 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.977972 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.978841 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.980314 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.980860 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc kubenswrapper[4852]: I1210 12:11:15.998512 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz87h\" (UniqueName: \"kubernetes.io/projected/06dd4615-ecfb-4e00-9dcf-ee18317d1f95-kube-api-access-xz87h\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:15 crc 
kubenswrapper[4852]: I1210 12:11:15.998704 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"06dd4615-ecfb-4e00-9dcf-ee18317d1f95\") " pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.056098 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.057156 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.058953 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qvfn8" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.059405 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.059794 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.073300 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.077953 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.077996 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-kolla-config\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.078026 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-config-data\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.078049 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.078109 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wddcm\" (UniqueName: \"kubernetes.io/projected/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-kube-api-access-wddcm\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.128863 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.179677 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wddcm\" (UniqueName: \"kubernetes.io/projected/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-kube-api-access-wddcm\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.179768 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.179794 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-kolla-config\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.179829 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-config-data\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.179855 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.180675 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-kolla-config\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.180736 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-config-data\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.185795 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.187460 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.196744 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wddcm\" (UniqueName: \"kubernetes.io/projected/7a324e51-4ea8-4cca-8cfd-6f64d13cd706-kube-api-access-wddcm\") pod \"memcached-0\" (UID: 
\"7a324e51-4ea8-4cca-8cfd-6f64d13cd706\") " pod="openstack/memcached-0" Dec 10 12:11:16 crc kubenswrapper[4852]: I1210 12:11:16.372052 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.687105 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.688372 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.690064 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rh2qj" Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.709285 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.803108 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctdwr\" (UniqueName: \"kubernetes.io/projected/681949fa-a426-400e-8f81-475a0555dc08-kube-api-access-ctdwr\") pod \"kube-state-metrics-0\" (UID: \"681949fa-a426-400e-8f81-475a0555dc08\") " pod="openstack/kube-state-metrics-0" Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.904738 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctdwr\" (UniqueName: \"kubernetes.io/projected/681949fa-a426-400e-8f81-475a0555dc08-kube-api-access-ctdwr\") pod \"kube-state-metrics-0\" (UID: \"681949fa-a426-400e-8f81-475a0555dc08\") " pod="openstack/kube-state-metrics-0" Dec 10 12:11:17 crc kubenswrapper[4852]: I1210 12:11:17.921225 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctdwr\" (UniqueName: \"kubernetes.io/projected/681949fa-a426-400e-8f81-475a0555dc08-kube-api-access-ctdwr\") pod \"kube-state-metrics-0\" (UID: \"681949fa-a426-400e-8f81-475a0555dc08\") " pod="openstack/kube-state-metrics-0" Dec 10 12:11:18 crc kubenswrapper[4852]: I1210 12:11:18.004859 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.468965 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gkhdx"] Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.470473 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.483758 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6qdnb" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.484033 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.484061 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.487181 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gkhdx"] Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.511391 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qd68p"] Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.513696 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.529668 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qd68p"] Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555297 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-run-ovn\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555345 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6246b317-7d73-49ff-bd8e-f4862a4584c6-scripts\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555371 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-lib\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555501 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-log-ovn\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555538 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6246b317-7d73-49ff-bd8e-f4862a4584c6-ovn-controller-tls-certs\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555555 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6246b317-7d73-49ff-bd8e-f4862a4584c6-combined-ca-bundle\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555591 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqn8l\" (UniqueName: \"kubernetes.io/projected/6246b317-7d73-49ff-bd8e-f4862a4584c6-kube-api-access-dqn8l\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555607 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-run\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555696 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b4670741-ddce-45cb-aa16-8f7c419f0c89-scripts\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555717 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-etc-ovs\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555745 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-log\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555767 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-run\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.555805 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2z4\" (UniqueName: \"kubernetes.io/projected/b4670741-ddce-45cb-aa16-8f7c419f0c89-kube-api-access-9k2z4\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657075 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4670741-ddce-45cb-aa16-8f7c419f0c89-scripts\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657118 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-etc-ovs\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657164 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-log\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657195 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-run\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657265 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k2z4\" (UniqueName: \"kubernetes.io/projected/b4670741-ddce-45cb-aa16-8f7c419f0c89-kube-api-access-9k2z4\") pod \"ovn-controller-ovs-qd68p\" (UID: 
\"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657314 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-run-ovn\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657343 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6246b317-7d73-49ff-bd8e-f4862a4584c6-scripts\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657365 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-lib\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657390 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-log-ovn\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657411 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6246b317-7d73-49ff-bd8e-f4862a4584c6-ovn-controller-tls-certs\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657429 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6246b317-7d73-49ff-bd8e-f4862a4584c6-combined-ca-bundle\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657453 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqn8l\" (UniqueName: \"kubernetes.io/projected/6246b317-7d73-49ff-bd8e-f4862a4584c6-kube-api-access-dqn8l\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657471 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-run\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657649 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-etc-ovs\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.657943 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-log-ovn\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.658071 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-run\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.658092 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-run\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.658206 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6246b317-7d73-49ff-bd8e-f4862a4584c6-var-run-ovn\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.658207 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-lib\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.658273 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b4670741-ddce-45cb-aa16-8f7c419f0c89-var-log\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.660207 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4670741-ddce-45cb-aa16-8f7c419f0c89-scripts\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.660455 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6246b317-7d73-49ff-bd8e-f4862a4584c6-scripts\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.664407 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6246b317-7d73-49ff-bd8e-f4862a4584c6-combined-ca-bundle\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.674191 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6246b317-7d73-49ff-bd8e-f4862a4584c6-ovn-controller-tls-certs\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.678596 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2z4\" (UniqueName: \"kubernetes.io/projected/b4670741-ddce-45cb-aa16-8f7c419f0c89-kube-api-access-9k2z4\") pod \"ovn-controller-ovs-qd68p\" (UID: \"b4670741-ddce-45cb-aa16-8f7c419f0c89\") " pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.687880 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqn8l\" (UniqueName: \"kubernetes.io/projected/6246b317-7d73-49ff-bd8e-f4862a4584c6-kube-api-access-dqn8l\") pod \"ovn-controller-gkhdx\" (UID: \"6246b317-7d73-49ff-bd8e-f4862a4584c6\") " pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.807617 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:21 crc kubenswrapper[4852]: I1210 12:11:21.835588 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.365917 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.368758 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.371456 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.371514 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.371674 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dnx7s" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.371768 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.373612 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.376488 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.473544 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.473671 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.473771 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " 
pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.473857 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.473883 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.473968 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.474004 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.474028 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnf8\" (UniqueName: \"kubernetes.io/projected/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-kube-api-access-zqnf8\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575465 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575537 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575597 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575652 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575683 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575747 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575778 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575846 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnf8\" (UniqueName: \"kubernetes.io/projected/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-kube-api-access-zqnf8\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.575987 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.576208 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.577206 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.577309 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.580838 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.581814 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.586201 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.591323 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnf8\" (UniqueName: \"kubernetes.io/projected/3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0-kube-api-access-zqnf8\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.601813 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0\") " pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:22 crc kubenswrapper[4852]: I1210 12:11:22.696093 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.450529 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.452166 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.455051 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.455196 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qg5tb" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.455872 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.456043 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.460795 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560076 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560131 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560160 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560194 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjht4\" (UniqueName: \"kubernetes.io/projected/b16645b8-8fa6-46cc-848a-2815e736e9b2-kube-api-access-sjht4\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560291 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b16645b8-8fa6-46cc-848a-2815e736e9b2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560339 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b16645b8-8fa6-46cc-848a-2815e736e9b2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560397 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16645b8-8fa6-46cc-848a-2815e736e9b2-config\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.560421 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661209 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661471 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661590 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661697 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjht4\" (UniqueName: \"kubernetes.io/projected/b16645b8-8fa6-46cc-848a-2815e736e9b2-kube-api-access-sjht4\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661777 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/b16645b8-8fa6-46cc-848a-2815e736e9b2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661866 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b16645b8-8fa6-46cc-848a-2815e736e9b2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661971 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16645b8-8fa6-46cc-848a-2815e736e9b2-config\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.662048 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.661550 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.662521 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b16645b8-8fa6-46cc-848a-2815e736e9b2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.662919 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16645b8-8fa6-46cc-848a-2815e736e9b2-config\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.663177 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b16645b8-8fa6-46cc-848a-2815e736e9b2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.666819 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.668853 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.670132 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16645b8-8fa6-46cc-848a-2815e736e9b2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.684627 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.687389 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjht4\" (UniqueName: \"kubernetes.io/projected/b16645b8-8fa6-46cc-848a-2815e736e9b2-kube-api-access-sjht4\") pod \"ovsdbserver-sb-0\" (UID: \"b16645b8-8fa6-46cc-848a-2815e736e9b2\") " pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:24 crc kubenswrapper[4852]: I1210 12:11:24.773571 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:25 crc kubenswrapper[4852]: I1210 12:11:25.442167 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5zwrg"] Dec 10 12:11:30 crc kubenswrapper[4852]: I1210 12:11:30.040732 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:11:30 crc kubenswrapper[4852]: I1210 12:11:30.073355 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" event={"ID":"1d872505-242e-4adc-acdd-756183702aba","Type":"ContainerStarted","Data":"38393b4663e1b2d1e443dfb0df303312efda09238e0280edda6033bb0dd00ffa"} Dec 10 12:11:30 crc kubenswrapper[4852]: I1210 12:11:30.153437 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:11:30 crc kubenswrapper[4852]: I1210 12:11:30.163791 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 10 12:11:30 crc kubenswrapper[4852]: W1210 12:11:30.752483 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a9edae_3cd0_4f71_ba18_9800a7baefef.slice/crio-983fa8f7b86f529a0b5d318b2f39aa03f4a6aadf75435ce116221e4854a8228c WatchSource:0}: Error finding container 983fa8f7b86f529a0b5d318b2f39aa03f4a6aadf75435ce116221e4854a8228c: Status 404 returned error can't find the container with id 983fa8f7b86f529a0b5d318b2f39aa03f4a6aadf75435ce116221e4854a8228c Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.080694 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06dd4615-ecfb-4e00-9dcf-ee18317d1f95","Type":"ContainerStarted","Data":"f1a611b2c5d49be8795dc30be725e1e46d336eabb5f21cc96ee583c723b8b285"} Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.081920 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09a9edae-3cd0-4f71-ba18-9800a7baefef","Type":"ContainerStarted","Data":"983fa8f7b86f529a0b5d318b2f39aa03f4a6aadf75435ce116221e4854a8228c"} Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.082813 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15a1ed1e-209b-4c71-b15f-44caaec70e93","Type":"ContainerStarted","Data":"a17be7876e80540d338945b08f519ab20c337a2ab5d470a78b2f736639715b62"} Dec 10 12:11:31 crc 
kubenswrapper[4852]: E1210 12:11:31.148479 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 10 12:11:31 crc kubenswrapper[4852]: E1210 12:11:31.148640 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4svpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jthws_openstack(ab983f60-c76c-44f2-bd26-99206a0d42b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:11:31 crc kubenswrapper[4852]: E1210 12:11:31.149804 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" podUID="ab983f60-c76c-44f2-bd26-99206a0d42b7" Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.259150 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-srnc7"] Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.296090 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.302138 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gkhdx"] Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.309292 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 
12:11:31.438537 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 10 12:11:31 crc kubenswrapper[4852]: E1210 12:11:31.454486 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 10 12:11:31 crc kubenswrapper[4852]: E1210 12:11:31.454621 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjd6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-jkdn9_openstack(ba40c379-ddd2-43fc-8ed6-db6b12a1efe8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:11:31 crc kubenswrapper[4852]: E1210 12:11:31.457924 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" podUID="ba40c379-ddd2-43fc-8ed6-db6b12a1efe8" Dec 10 12:11:31 crc kubenswrapper[4852]: I1210 12:11:31.541531 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 10 12:11:31 crc kubenswrapper[4852]: W1210 12:11:31.589572 4852 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6246b317_7d73_49ff_bd8e_f4862a4584c6.slice/crio-30f52ac405b9962adfaa48cbbbff4352b824aa2274f0ef261c0502be4bf04db5 WatchSource:0}: Error finding container 30f52ac405b9962adfaa48cbbbff4352b824aa2274f0ef261c0502be4bf04db5: Status 404 returned error can't find the container with id 30f52ac405b9962adfaa48cbbbff4352b824aa2274f0ef261c0502be4bf04db5 Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.091181 4852 generic.go:334] "Generic (PLEG): container finished" podID="1d872505-242e-4adc-acdd-756183702aba" containerID="5ceacbd6d1013e770d05fe646f2feaee65d35997b19a46b693ae435d4dfe80ba" exitCode=0 Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.091278 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" event={"ID":"1d872505-242e-4adc-acdd-756183702aba","Type":"ContainerDied","Data":"5ceacbd6d1013e770d05fe646f2feaee65d35997b19a46b693ae435d4dfe80ba"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.092767 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"681949fa-a426-400e-8f81-475a0555dc08","Type":"ContainerStarted","Data":"19a752e59d7fd623214bbccbb079dfb6e784e5866b40fc5f8f47d72b1cae5e56"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.096143 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" event={"ID":"574cc4fc-ebb4-419f-b6a7-1904a92f05a3","Type":"ContainerStarted","Data":"cc1b6b297325652b886eb4958c1fea57557d9c7a44c011e34faa97b23ed3dad5"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.097998 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d466b79-84c0-42e9-8952-8491b4ced74e","Type":"ContainerStarted","Data":"12b801919beb30bb0be95a4ab5919e5a89a963b34c894a87e6c84476e0ccc185"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.099263 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx" event={"ID":"6246b317-7d73-49ff-bd8e-f4862a4584c6","Type":"ContainerStarted","Data":"30f52ac405b9962adfaa48cbbbff4352b824aa2274f0ef261c0502be4bf04db5"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.102413 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b16645b8-8fa6-46cc-848a-2815e736e9b2","Type":"ContainerStarted","Data":"47f7a6ae876fef67f7fd0fde1133035767f4f298f01b8d3266efd1d90a2e7a61"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.104800 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7a324e51-4ea8-4cca-8cfd-6f64d13cd706","Type":"ContainerStarted","Data":"79f5dc6efd56f0f6dd03b84b307bac038fe68c5dd14162298bb1ad2ad58408a8"} Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.223206 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.424382 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qd68p"] Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.467324 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.490546 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjd6r\" (UniqueName: \"kubernetes.io/projected/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-kube-api-access-bjd6r\") pod \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.490633 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-config\") pod \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.491088 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-dns-svc\") pod \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\" (UID: \"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8\") " Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.491235 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-config" (OuterVolumeSpecName: "config") pod "ba40c379-ddd2-43fc-8ed6-db6b12a1efe8" (UID: "ba40c379-ddd2-43fc-8ed6-db6b12a1efe8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.491514 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba40c379-ddd2-43fc-8ed6-db6b12a1efe8" (UID: "ba40c379-ddd2-43fc-8ed6-db6b12a1efe8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.491910 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.491936 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.493948 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-kube-api-access-bjd6r" (OuterVolumeSpecName: "kube-api-access-bjd6r") pod "ba40c379-ddd2-43fc-8ed6-db6b12a1efe8" (UID: "ba40c379-ddd2-43fc-8ed6-db6b12a1efe8"). InnerVolumeSpecName "kube-api-access-bjd6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.499657 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.593310 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab983f60-c76c-44f2-bd26-99206a0d42b7-config\") pod \"ab983f60-c76c-44f2-bd26-99206a0d42b7\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.593410 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svpn\" (UniqueName: \"kubernetes.io/projected/ab983f60-c76c-44f2-bd26-99206a0d42b7-kube-api-access-4svpn\") pod \"ab983f60-c76c-44f2-bd26-99206a0d42b7\" (UID: \"ab983f60-c76c-44f2-bd26-99206a0d42b7\") " Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.593814 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjd6r\" (UniqueName: \"kubernetes.io/projected/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8-kube-api-access-bjd6r\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.593962 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab983f60-c76c-44f2-bd26-99206a0d42b7-config" (OuterVolumeSpecName: "config") pod "ab983f60-c76c-44f2-bd26-99206a0d42b7" (UID: "ab983f60-c76c-44f2-bd26-99206a0d42b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.597034 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab983f60-c76c-44f2-bd26-99206a0d42b7-kube-api-access-4svpn" (OuterVolumeSpecName: "kube-api-access-4svpn") pod "ab983f60-c76c-44f2-bd26-99206a0d42b7" (UID: "ab983f60-c76c-44f2-bd26-99206a0d42b7"). InnerVolumeSpecName "kube-api-access-4svpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.697290 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab983f60-c76c-44f2-bd26-99206a0d42b7-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:32 crc kubenswrapper[4852]: I1210 12:11:32.697844 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svpn\" (UniqueName: \"kubernetes.io/projected/ab983f60-c76c-44f2-bd26-99206a0d42b7-kube-api-access-4svpn\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.111819 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd68p" event={"ID":"b4670741-ddce-45cb-aa16-8f7c419f0c89","Type":"ContainerStarted","Data":"d28ab34942e687f1c062951bc1632af1b438b4a61bd4ea2ea7c433dd3d5fe241"} Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.112918 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0","Type":"ContainerStarted","Data":"4b0f30bc837206a2862df13776594274cbd91c715fc3fca568b4e2f17237284e"} Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.113978 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.113981 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jkdn9" event={"ID":"ba40c379-ddd2-43fc-8ed6-db6b12a1efe8","Type":"ContainerDied","Data":"2555b407b5147e17bec5873c0420f27989583873e42a9b98b24baeea76797225"} Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.115156 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.115153 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jthws" event={"ID":"ab983f60-c76c-44f2-bd26-99206a0d42b7","Type":"ContainerDied","Data":"c89466ebdbeccaecc0d06b1bb29dba81fdc747e3cd1628d85fc1fb798498ffb2"} Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.116910 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" event={"ID":"1d872505-242e-4adc-acdd-756183702aba","Type":"ContainerStarted","Data":"ea77f26c94cc33aac957f4c873d4145eb53a779c953ed08b5e911dbdea82a9da"} Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.117049 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.143989 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" podStartSLOduration=20.101130848 podStartE2EDuration="22.143970776s" podCreationTimestamp="2025-12-10 12:11:11 +0000 UTC" firstStartedPulling="2025-12-10 12:11:29.636072479 +0000 UTC m=+1175.721597703" lastFinishedPulling="2025-12-10 12:11:31.678912407 +0000 UTC m=+1177.764437631" observedRunningTime="2025-12-10 12:11:33.136088799 +0000 UTC m=+1179.221614023" watchObservedRunningTime="2025-12-10 12:11:33.143970776 +0000 UTC m=+1179.229496000" Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.179392 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jkdn9"] Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.192610 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jkdn9"] Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.209212 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jthws"] Dec 10 12:11:33 crc kubenswrapper[4852]: I1210 12:11:33.214746 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jthws"] Dec 10 12:11:34 crc kubenswrapper[4852]: I1210 12:11:34.125487 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" event={"ID":"574cc4fc-ebb4-419f-b6a7-1904a92f05a3","Type":"ContainerStarted","Data":"dfcc1f3a15e83e9fce1188bc450b866af855af9eb781fe2c25d1a07685bd9e9f"} Dec 10 12:11:34 crc kubenswrapper[4852]: I1210 12:11:34.204528 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab983f60-c76c-44f2-bd26-99206a0d42b7" path="/var/lib/kubelet/pods/ab983f60-c76c-44f2-bd26-99206a0d42b7/volumes" Dec 10 12:11:34 crc kubenswrapper[4852]: I1210 12:11:34.205501 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba40c379-ddd2-43fc-8ed6-db6b12a1efe8" path="/var/lib/kubelet/pods/ba40c379-ddd2-43fc-8ed6-db6b12a1efe8/volumes" Dec 10 12:11:35 crc kubenswrapper[4852]: I1210 12:11:35.136565 4852 generic.go:334] 
"Generic (PLEG): container finished" podID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerID="dfcc1f3a15e83e9fce1188bc450b866af855af9eb781fe2c25d1a07685bd9e9f" exitCode=0 Dec 10 12:11:35 crc kubenswrapper[4852]: I1210 12:11:35.136613 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" event={"ID":"574cc4fc-ebb4-419f-b6a7-1904a92f05a3","Type":"ContainerDied","Data":"dfcc1f3a15e83e9fce1188bc450b866af855af9eb781fe2c25d1a07685bd9e9f"} Dec 10 12:11:42 crc kubenswrapper[4852]: I1210 12:11:41.672414 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" Dec 10 12:11:43 crc kubenswrapper[4852]: I1210 12:11:43.193955 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" event={"ID":"574cc4fc-ebb4-419f-b6a7-1904a92f05a3","Type":"ContainerStarted","Data":"3d8587f2cd2e0c6a0d2fabc1650a865b3a8e131467a579943e1c5ae380144bb6"} Dec 10 12:11:43 crc kubenswrapper[4852]: I1210 12:11:43.195488 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:43 crc kubenswrapper[4852]: I1210 12:11:43.221976 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" podStartSLOduration=30.169775271 podStartE2EDuration="32.221951719s" podCreationTimestamp="2025-12-10 12:11:11 +0000 UTC" firstStartedPulling="2025-12-10 12:11:31.581405779 +0000 UTC m=+1177.666931003" lastFinishedPulling="2025-12-10 12:11:33.633582227 +0000 UTC m=+1179.719107451" observedRunningTime="2025-12-10 12:11:43.217598611 +0000 UTC m=+1189.303123865" watchObservedRunningTime="2025-12-10 12:11:43.221951719 +0000 UTC m=+1189.307476943" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.202415 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06dd4615-ecfb-4e00-9dcf-ee18317d1f95","Type":"ContainerStarted","Data":"cd40047aaf03472fd7292535f78dd9db209dc4e14e7e651993ebaec185e45b37"} Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.204203 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0","Type":"ContainerStarted","Data":"6a0c0a63cf8d3b9e64efa08dd68b7b40f5eb6d00a7f5e5fc02450f709a0b7d93"} Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.820966 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nspk8"] Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.822402 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.824303 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.832846 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nspk8"] Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.919344 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c06b796-7229-47bc-889c-4a78ef3a186a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.919398 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c06b796-7229-47bc-889c-4a78ef3a186a-config\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.919430 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbzr7\" (UniqueName: \"kubernetes.io/projected/2c06b796-7229-47bc-889c-4a78ef3a186a-kube-api-access-wbzr7\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.919453 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c06b796-7229-47bc-889c-4a78ef3a186a-combined-ca-bundle\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.919500 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c06b796-7229-47bc-889c-4a78ef3a186a-ovs-rundir\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.919534 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c06b796-7229-47bc-889c-4a78ef3a186a-ovn-rundir\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.977590 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-srnc7"] Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.995444 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-9ddc9"] Dec 10 12:11:44 crc kubenswrapper[4852]: I1210 12:11:44.997362 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.002545 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.006881 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-9ddc9"] Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.022665 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c06b796-7229-47bc-889c-4a78ef3a186a-combined-ca-bundle\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.022709 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c06b796-7229-47bc-889c-4a78ef3a186a-ovs-rundir\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.022750 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c06b796-7229-47bc-889c-4a78ef3a186a-ovn-rundir\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.022845 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c06b796-7229-47bc-889c-4a78ef3a186a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.022867 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c06b796-7229-47bc-889c-4a78ef3a186a-config\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.022886 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbzr7\" (UniqueName: \"kubernetes.io/projected/2c06b796-7229-47bc-889c-4a78ef3a186a-kube-api-access-wbzr7\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.023668 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2c06b796-7229-47bc-889c-4a78ef3a186a-ovn-rundir\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.023755 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2c06b796-7229-47bc-889c-4a78ef3a186a-ovs-rundir\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.028895 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c06b796-7229-47bc-889c-4a78ef3a186a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.029164 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c06b796-7229-47bc-889c-4a78ef3a186a-combined-ca-bundle\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.029555 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c06b796-7229-47bc-889c-4a78ef3a186a-config\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.039541 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbzr7\" (UniqueName: \"kubernetes.io/projected/2c06b796-7229-47bc-889c-4a78ef3a186a-kube-api-access-wbzr7\") pod \"ovn-controller-metrics-nspk8\" (UID: \"2c06b796-7229-47bc-889c-4a78ef3a186a\") " pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.121597 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-9ddc9"] Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.124682 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8z4\" (UniqueName: \"kubernetes.io/projected/7132d3e1-acac-43fc-a865-8fdee543717c-kube-api-access-fc8z4\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.124739 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.124864 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.124896 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.153516 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-bc6n4"] Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.154893 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.157199 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.169333 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bc6n4"] Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.221471 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx" event={"ID":"6246b317-7d73-49ff-bd8e-f4862a4584c6","Type":"ContainerStarted","Data":"7c9c89931c21dc84c6b89441c068b8083e8e993eddd124ec8c4cc97522ba256a"} Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.222223 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-gkhdx" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.225548 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b16645b8-8fa6-46cc-848a-2815e736e9b2","Type":"ContainerStarted","Data":"956bf1d11c6e8672a4d0183b690802ad16d339397d1fe5d04d6d1261cc8a68b0"} Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.225647 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-dns-svc\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.225706 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.225781 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.225946 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.226074 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8z4\" (UniqueName: \"kubernetes.io/projected/7132d3e1-acac-43fc-a865-8fdee543717c-kube-api-access-fc8z4\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.226125 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc 
kubenswrapper[4852]: I1210 12:11:45.226218 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-config\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.226269 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl68s\" (UniqueName: \"kubernetes.io/projected/3f42e185-974f-4bfe-95a3-8d9ba45b934a-kube-api-access-gl68s\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.226311 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.226520 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.227458 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.227788 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.228929 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7a324e51-4ea8-4cca-8cfd-6f64d13cd706","Type":"ContainerStarted","Data":"27c5a230a867ab24bbf3a0db2eebf9c9b0d620377917b9a0d010c99ce2ec69c8"} Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.229604 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.231950 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"681949fa-a426-400e-8f81-475a0555dc08","Type":"ContainerStarted","Data":"ba91414b2b8cececfe2dd537993001c8ce1311c25ee723642eaaa47f1e5eed51"} Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.232671 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.235797 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d466b79-84c0-42e9-8952-8491b4ced74e","Type":"ContainerStarted","Data":"2ee8d97b43312e7b3e889894be4575382d774edddb98d31b3b934b8755cb0bc8"} 
Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.243965 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gkhdx" podStartSLOduration=13.291699075 podStartE2EDuration="24.243946498s" podCreationTimestamp="2025-12-10 12:11:21 +0000 UTC" firstStartedPulling="2025-12-10 12:11:31.676486496 +0000 UTC m=+1177.762011740" lastFinishedPulling="2025-12-10 12:11:42.628733939 +0000 UTC m=+1188.714259163" observedRunningTime="2025-12-10 12:11:45.240665016 +0000 UTC m=+1191.326190250" watchObservedRunningTime="2025-12-10 12:11:45.243946498 +0000 UTC m=+1191.329471722" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.244375 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd68p" event={"ID":"b4670741-ddce-45cb-aa16-8f7c419f0c89","Type":"ContainerStarted","Data":"608cda3808c4cd21b16d7c2ddd2084a75f7182b7be951f089f744bed1401c9bd"} Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.247573 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8z4\" (UniqueName: \"kubernetes.io/projected/7132d3e1-acac-43fc-a865-8fdee543717c-kube-api-access-fc8z4\") pod \"dnsmasq-dns-5bf47b49b7-9ddc9\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.316895 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.357661348 podStartE2EDuration="29.316869744s" podCreationTimestamp="2025-12-10 12:11:16 +0000 UTC" firstStartedPulling="2025-12-10 12:11:31.676723562 +0000 UTC m=+1177.762248786" lastFinishedPulling="2025-12-10 12:11:42.635931958 +0000 UTC m=+1188.721457182" observedRunningTime="2025-12-10 12:11:45.258972152 +0000 UTC m=+1191.344497406" watchObservedRunningTime="2025-12-10 12:11:45.316869744 +0000 UTC m=+1191.402394968" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.328687 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-config\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.328728 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl68s\" (UniqueName: \"kubernetes.io/projected/3f42e185-974f-4bfe-95a3-8d9ba45b934a-kube-api-access-gl68s\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.328751 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.328823 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-dns-svc\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.328930 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.330251 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-config\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.331275 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.332085 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-dns-svc\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.332132 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.875717608 podStartE2EDuration="28.332111093s" podCreationTimestamp="2025-12-10 12:11:17 +0000 UTC" firstStartedPulling="2025-12-10 12:11:31.591246404 +0000 UTC m=+1177.676771628" lastFinishedPulling="2025-12-10 12:11:44.047639889 +0000 UTC m=+1190.133165113" observedRunningTime="2025-12-10 12:11:45.320876274 +0000 UTC m=+1191.406401508" watchObservedRunningTime="2025-12-10 12:11:45.332111093 +0000 UTC m=+1191.417636317" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.332537 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.365559 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nspk8" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.369890 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl68s\" (UniqueName: \"kubernetes.io/projected/3f42e185-974f-4bfe-95a3-8d9ba45b934a-kube-api-access-gl68s\") pod \"dnsmasq-dns-8554648995-bc6n4\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.529121 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.546404 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:45 crc kubenswrapper[4852]: I1210 12:11:45.954609 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nspk8"] Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.195657 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bc6n4"] Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.202631 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-9ddc9"] Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.253984 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09a9edae-3cd0-4f71-ba18-9800a7baefef","Type":"ContainerStarted","Data":"18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d"} Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.258725 4852 generic.go:334] "Generic (PLEG): container finished" podID="b4670741-ddce-45cb-aa16-8f7c419f0c89" containerID="608cda3808c4cd21b16d7c2ddd2084a75f7182b7be951f089f744bed1401c9bd" exitCode=0 Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.258911 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd68p" event={"ID":"b4670741-ddce-45cb-aa16-8f7c419f0c89","Type":"ContainerDied","Data":"608cda3808c4cd21b16d7c2ddd2084a75f7182b7be951f089f744bed1401c9bd"} Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.262452 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nspk8" event={"ID":"2c06b796-7229-47bc-889c-4a78ef3a186a","Type":"ContainerStarted","Data":"a864d60b738b7d30e116928e749dc8fa79bc671cc84927016f56fa9997446e40"} Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.265135 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15a1ed1e-209b-4c71-b15f-44caaec70e93","Type":"ContainerStarted","Data":"021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27"} Dec 10 12:11:46 crc kubenswrapper[4852]: I1210 12:11:46.265744 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="dnsmasq-dns" containerID="cri-o://3d8587f2cd2e0c6a0d2fabc1650a865b3a8e131467a579943e1c5ae380144bb6" gracePeriod=10 Dec 10 12:11:47 crc kubenswrapper[4852]: I1210 12:11:47.275187 4852 generic.go:334] "Generic (PLEG): container finished" podID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerID="3d8587f2cd2e0c6a0d2fabc1650a865b3a8e131467a579943e1c5ae380144bb6" exitCode=0 Dec 10 12:11:47 crc kubenswrapper[4852]: I1210 12:11:47.275284 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" event={"ID":"574cc4fc-ebb4-419f-b6a7-1904a92f05a3","Type":"ContainerDied","Data":"3d8587f2cd2e0c6a0d2fabc1650a865b3a8e131467a579943e1c5ae380144bb6"} Dec 10 12:11:48 crc kubenswrapper[4852]: I1210 12:11:48.285537 4852 generic.go:334] "Generic (PLEG): container finished" podID="06dd4615-ecfb-4e00-9dcf-ee18317d1f95" containerID="cd40047aaf03472fd7292535f78dd9db209dc4e14e7e651993ebaec185e45b37" exitCode=0 Dec 10 12:11:48 crc kubenswrapper[4852]: I1210 12:11:48.285582 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06dd4615-ecfb-4e00-9dcf-ee18317d1f95","Type":"ContainerDied","Data":"cd40047aaf03472fd7292535f78dd9db209dc4e14e7e651993ebaec185e45b37"} Dec 10 
12:11:49 crc kubenswrapper[4852]: I1210 12:11:49.293698 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" event={"ID":"7132d3e1-acac-43fc-a865-8fdee543717c","Type":"ContainerStarted","Data":"e7ba9c68060a6fa656053482df90328d4d8c82888ba54cc89fec15974ff0d67d"} Dec 10 12:11:49 crc kubenswrapper[4852]: I1210 12:11:49.294721 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bc6n4" event={"ID":"3f42e185-974f-4bfe-95a3-8d9ba45b934a","Type":"ContainerStarted","Data":"f02078783e4a3b7dcb912aa3d63b4543fdced625779e025b468c9c96fa9dcc19"} Dec 10 12:11:51 crc kubenswrapper[4852]: I1210 12:11:51.374452 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.108725 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.204454 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-config\") pod \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.204531 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xslq\" (UniqueName: \"kubernetes.io/projected/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-kube-api-access-4xslq\") pod \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.204565 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-dns-svc\") pod \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\" (UID: \"574cc4fc-ebb4-419f-b6a7-1904a92f05a3\") " Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.208450 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-kube-api-access-4xslq" (OuterVolumeSpecName: "kube-api-access-4xslq") pod "574cc4fc-ebb4-419f-b6a7-1904a92f05a3" (UID: "574cc4fc-ebb4-419f-b6a7-1904a92f05a3"). InnerVolumeSpecName "kube-api-access-4xslq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.251631 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-config" (OuterVolumeSpecName: "config") pod "574cc4fc-ebb4-419f-b6a7-1904a92f05a3" (UID: "574cc4fc-ebb4-419f-b6a7-1904a92f05a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.256897 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "574cc4fc-ebb4-419f-b6a7-1904a92f05a3" (UID: "574cc4fc-ebb4-419f-b6a7-1904a92f05a3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.306783 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.306812 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xslq\" (UniqueName: \"kubernetes.io/projected/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-kube-api-access-4xslq\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.306823 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/574cc4fc-ebb4-419f-b6a7-1904a92f05a3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.344638 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" event={"ID":"574cc4fc-ebb4-419f-b6a7-1904a92f05a3","Type":"ContainerDied","Data":"cc1b6b297325652b886eb4958c1fea57557d9c7a44c011e34faa97b23ed3dad5"} Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.344683 4852 scope.go:117] "RemoveContainer" containerID="3d8587f2cd2e0c6a0d2fabc1650a865b3a8e131467a579943e1c5ae380144bb6" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.344810 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.353981 4852 generic.go:334] "Generic (PLEG): container finished" podID="2d466b79-84c0-42e9-8952-8491b4ced74e" containerID="2ee8d97b43312e7b3e889894be4575382d774edddb98d31b3b934b8755cb0bc8" exitCode=0 Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.354029 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d466b79-84c0-42e9-8952-8491b4ced74e","Type":"ContainerDied","Data":"2ee8d97b43312e7b3e889894be4575382d774edddb98d31b3b934b8755cb0bc8"} Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.374353 4852 scope.go:117] "RemoveContainer" containerID="dfcc1f3a15e83e9fce1188bc450b866af855af9eb781fe2c25d1a07685bd9e9f" Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.396684 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-srnc7"] Dec 10 12:11:55 crc kubenswrapper[4852]: I1210 12:11:55.403655 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-srnc7"] Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.181302 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" path="/var/lib/kubelet/pods/574cc4fc-ebb4-419f-b6a7-1904a92f05a3/volumes" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.364991 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nspk8" event={"ID":"2c06b796-7229-47bc-889c-4a78ef3a186a","Type":"ContainerStarted","Data":"b3e63dd710c833f757bdd6410169848cad27cfe4c1c281333783ed8c16d824fc"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.366776 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"06dd4615-ecfb-4e00-9dcf-ee18317d1f95","Type":"ContainerStarted","Data":"b92e90c7d68224b2785e0ac92f14f892a65937832ef10d25ee9097f4b4121b89"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.371307 4852 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2d466b79-84c0-42e9-8952-8491b4ced74e","Type":"ContainerStarted","Data":"463f47b9470ced4e4253b180f805c14bb485ded3151f87ef6d1b58a0be7f04d5"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.373792 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd68p" event={"ID":"b4670741-ddce-45cb-aa16-8f7c419f0c89","Type":"ContainerStarted","Data":"00df75be39c9fffde3d5d47035e4ecd1888393fc6c75b0cfb54bca82792d5c51"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.373818 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd68p" event={"ID":"b4670741-ddce-45cb-aa16-8f7c419f0c89","Type":"ContainerStarted","Data":"34b4ab8f72a69c0b5d82fcbf11cdc5413c7e581095c6322e2c3c13b08c116e30"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.374107 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.374131 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.391673 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nspk8" podStartSLOduration=2.24801037 podStartE2EDuration="12.391652309s" podCreationTimestamp="2025-12-10 12:11:44 +0000 UTC" firstStartedPulling="2025-12-10 12:11:45.98174792 +0000 UTC m=+1192.067273164" lastFinishedPulling="2025-12-10 12:11:56.125389879 +0000 UTC m=+1202.210915103" observedRunningTime="2025-12-10 12:11:56.38967071 +0000 UTC m=+1202.475195944" watchObservedRunningTime="2025-12-10 12:11:56.391652309 +0000 UTC m=+1202.477177533" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.393662 4852 generic.go:334] "Generic (PLEG): container finished" podID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerID="d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d" exitCode=0 Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.393743 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bc6n4" event={"ID":"3f42e185-974f-4bfe-95a3-8d9ba45b934a","Type":"ContainerDied","Data":"d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.413925 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b16645b8-8fa6-46cc-848a-2815e736e9b2","Type":"ContainerStarted","Data":"f9b0dd9d39fa01b24ac3114656f19ccd27ea14df715e913b7528f014bdfee409"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.427613 4852 generic.go:334] "Generic (PLEG): container finished" podID="7132d3e1-acac-43fc-a865-8fdee543717c" containerID="91337580496ef3924924efcd22cfe28e038e70a81daca16ccf241f5848a1a3b1" exitCode=0 Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.427715 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" event={"ID":"7132d3e1-acac-43fc-a865-8fdee543717c","Type":"ContainerDied","Data":"91337580496ef3924924efcd22cfe28e038e70a81daca16ccf241f5848a1a3b1"} Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.429870 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0","Type":"ContainerStarted","Data":"0120f490fc903864e7f15f78bc8150281fad4c4d964e139186241d0ab1d97b9b"} 
Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.446501 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.506504977 podStartE2EDuration="43.446478574s" podCreationTimestamp="2025-12-10 12:11:13 +0000 UTC" firstStartedPulling="2025-12-10 12:11:31.591121881 +0000 UTC m=+1177.676647105" lastFinishedPulling="2025-12-10 12:11:42.531095438 +0000 UTC m=+1188.616620702" observedRunningTime="2025-12-10 12:11:56.418013595 +0000 UTC m=+1202.503538819" watchObservedRunningTime="2025-12-10 12:11:56.446478574 +0000 UTC m=+1202.532003808" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.461091 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=32.418398343 podStartE2EDuration="42.461071117s" podCreationTimestamp="2025-12-10 12:11:14 +0000 UTC" firstStartedPulling="2025-12-10 12:11:30.766352103 +0000 UTC m=+1176.851877337" lastFinishedPulling="2025-12-10 12:11:40.809024897 +0000 UTC m=+1186.894550111" observedRunningTime="2025-12-10 12:11:56.438476825 +0000 UTC m=+1202.524002049" watchObservedRunningTime="2025-12-10 12:11:56.461071117 +0000 UTC m=+1202.546596341" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.476030 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qd68p" podStartSLOduration=25.271419962 podStartE2EDuration="35.476011569s" podCreationTimestamp="2025-12-10 12:11:21 +0000 UTC" firstStartedPulling="2025-12-10 12:11:32.433412853 +0000 UTC m=+1178.518938067" lastFinishedPulling="2025-12-10 12:11:42.63800445 +0000 UTC m=+1188.723529674" observedRunningTime="2025-12-10 12:11:56.463352684 +0000 UTC m=+1202.548877908" watchObservedRunningTime="2025-12-10 12:11:56.476011569 +0000 UTC m=+1202.561536793" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.523120 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.651898844 podStartE2EDuration="35.523094492s" podCreationTimestamp="2025-12-10 12:11:21 +0000 UTC" firstStartedPulling="2025-12-10 12:11:32.247022162 +0000 UTC m=+1178.332547386" lastFinishedPulling="2025-12-10 12:11:55.11821781 +0000 UTC m=+1201.203743034" observedRunningTime="2025-12-10 12:11:56.496365136 +0000 UTC m=+1202.581890370" watchObservedRunningTime="2025-12-10 12:11:56.523094492 +0000 UTC m=+1202.608619716" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.541741 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.109700434 podStartE2EDuration="33.541721226s" podCreationTimestamp="2025-12-10 12:11:23 +0000 UTC" firstStartedPulling="2025-12-10 12:11:31.676491006 +0000 UTC m=+1177.762016230" lastFinishedPulling="2025-12-10 12:11:55.108511798 +0000 UTC m=+1201.194037022" observedRunningTime="2025-12-10 12:11:56.539724916 +0000 UTC m=+1202.625250150" watchObservedRunningTime="2025-12-10 12:11:56.541721226 +0000 UTC m=+1202.627246450" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.730534 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.872270 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8z4\" (UniqueName: \"kubernetes.io/projected/7132d3e1-acac-43fc-a865-8fdee543717c-kube-api-access-fc8z4\") pod \"7132d3e1-acac-43fc-a865-8fdee543717c\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.872386 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-ovsdbserver-nb\") pod \"7132d3e1-acac-43fc-a865-8fdee543717c\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.872444 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config\") pod \"7132d3e1-acac-43fc-a865-8fdee543717c\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.872539 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-dns-svc\") pod \"7132d3e1-acac-43fc-a865-8fdee543717c\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.877622 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7132d3e1-acac-43fc-a865-8fdee543717c-kube-api-access-fc8z4" (OuterVolumeSpecName: "kube-api-access-fc8z4") pod "7132d3e1-acac-43fc-a865-8fdee543717c" (UID: "7132d3e1-acac-43fc-a865-8fdee543717c"). InnerVolumeSpecName "kube-api-access-fc8z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.892593 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7132d3e1-acac-43fc-a865-8fdee543717c" (UID: "7132d3e1-acac-43fc-a865-8fdee543717c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:56 crc kubenswrapper[4852]: E1210 12:11:56.894171 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config podName:7132d3e1-acac-43fc-a865-8fdee543717c nodeName:}" failed. No retries permitted until 2025-12-10 12:11:57.394147011 +0000 UTC m=+1203.479672235 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config") pod "7132d3e1-acac-43fc-a865-8fdee543717c" (UID: "7132d3e1-acac-43fc-a865-8fdee543717c") : error deleting /var/lib/kubelet/pods/7132d3e1-acac-43fc-a865-8fdee543717c/volume-subpaths: remove /var/lib/kubelet/pods/7132d3e1-acac-43fc-a865-8fdee543717c/volume-subpaths: no such file or directory Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.894518 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7132d3e1-acac-43fc-a865-8fdee543717c" (UID: "7132d3e1-acac-43fc-a865-8fdee543717c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.974936 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8z4\" (UniqueName: \"kubernetes.io/projected/7132d3e1-acac-43fc-a865-8fdee543717c-kube-api-access-fc8z4\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.974971 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.974981 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:56 crc kubenswrapper[4852]: I1210 12:11:56.993528 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-srnc7" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: i/o timeout" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.439295 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" event={"ID":"7132d3e1-acac-43fc-a865-8fdee543717c","Type":"ContainerDied","Data":"e7ba9c68060a6fa656053482df90328d4d8c82888ba54cc89fec15974ff0d67d"} Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.439353 4852 scope.go:117] "RemoveContainer" containerID="91337580496ef3924924efcd22cfe28e038e70a81daca16ccf241f5848a1a3b1" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.439307 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-9ddc9" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.443129 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bc6n4" event={"ID":"3f42e185-974f-4bfe-95a3-8d9ba45b934a","Type":"ContainerStarted","Data":"0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a"} Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.443175 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.481991 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config\") pod \"7132d3e1-acac-43fc-a865-8fdee543717c\" (UID: \"7132d3e1-acac-43fc-a865-8fdee543717c\") " Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.482542 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config" (OuterVolumeSpecName: "config") pod "7132d3e1-acac-43fc-a865-8fdee543717c" (UID: "7132d3e1-acac-43fc-a865-8fdee543717c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.499304 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-bc6n4" podStartSLOduration=12.499287289 podStartE2EDuration="12.499287289s" podCreationTimestamp="2025-12-10 12:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:11:57.498244813 +0000 UTC m=+1203.583770037" watchObservedRunningTime="2025-12-10 12:11:57.499287289 +0000 UTC m=+1203.584812513" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.583730 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132d3e1-acac-43fc-a865-8fdee543717c-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.697458 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.774361 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.791835 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-9ddc9"] Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.799326 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-9ddc9"] Dec 10 12:11:57 crc kubenswrapper[4852]: I1210 12:11:57.816731 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.016273 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.054903 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bc6n4"] Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.097330 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-cfm8q"] Dec 10 12:11:58 crc kubenswrapper[4852]: E1210 12:11:58.100994 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7132d3e1-acac-43fc-a865-8fdee543717c" containerName="init" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.101028 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="7132d3e1-acac-43fc-a865-8fdee543717c" containerName="init" Dec 10 12:11:58 crc kubenswrapper[4852]: E1210 12:11:58.101052 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="dnsmasq-dns" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.101060 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="dnsmasq-dns" Dec 10 12:11:58 crc kubenswrapper[4852]: E1210 12:11:58.101077 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="init" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.101085 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="init" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.101419 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="7132d3e1-acac-43fc-a865-8fdee543717c" containerName="init" Dec 10 
12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.101443 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="574cc4fc-ebb4-419f-b6a7-1904a92f05a3" containerName="dnsmasq-dns" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.108011 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.119359 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-cfm8q"] Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.181809 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7132d3e1-acac-43fc-a865-8fdee543717c" path="/var/lib/kubelet/pods/7132d3e1-acac-43fc-a865-8fdee543717c/volumes" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.191462 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-config\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.191503 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.191523 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.191554 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pz6l\" (UniqueName: \"kubernetes.io/projected/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-kube-api-access-6pz6l\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.191574 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.293489 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-config\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.293568 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.293596 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.293640 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pz6l\" (UniqueName: \"kubernetes.io/projected/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-kube-api-access-6pz6l\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.293668 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.295214 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.295353 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-config\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.295686 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.296197 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.314244 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pz6l\" (UniqueName: \"kubernetes.io/projected/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-kube-api-access-6pz6l\") pod \"dnsmasq-dns-b8fbc5445-cfm8q\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.427882 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.461098 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.511884 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.697280 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.752430 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:58 crc kubenswrapper[4852]: I1210 12:11:58.970772 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-cfm8q"] Dec 10 12:11:58 crc kubenswrapper[4852]: W1210 12:11:58.979539 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e1d381e_04cb_4b4f_bca5_d091b0b10dbe.slice/crio-b5a6fb2931f3e332fcca6ed8a268cd06d53ffa4a8dc9b4953c65facd24051092 WatchSource:0}: Error finding container b5a6fb2931f3e332fcca6ed8a268cd06d53ffa4a8dc9b4953c65facd24051092: Status 404 returned error can't find the container with id b5a6fb2931f3e332fcca6ed8a268cd06d53ffa4a8dc9b4953c65facd24051092 Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.156154 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.161029 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.162814 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.163221 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-s7jkt" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.164056 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.164293 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.179120 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.208669 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.208895 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.208992 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/41d04c65-c8a1-472a-bc74-6b20bec61fbc-lock\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.209059 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489ml\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-kube-api-access-489ml\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.209139 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/41d04c65-c8a1-472a-bc74-6b20bec61fbc-cache\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.310734 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.311220 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.311665 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/41d04c65-c8a1-472a-bc74-6b20bec61fbc-lock\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.312257 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-489ml\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-kube-api-access-489ml\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.312696 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/41d04c65-c8a1-472a-bc74-6b20bec61fbc-cache\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.312199 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/41d04c65-c8a1-472a-bc74-6b20bec61fbc-lock\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.311625 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: E1210 12:11:59.311167 4852 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Dec 10 12:11:59 crc kubenswrapper[4852]: E1210 12:11:59.314062 4852 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 10 12:11:59 crc kubenswrapper[4852]: E1210 12:11:59.314136 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift podName:41d04c65-c8a1-472a-bc74-6b20bec61fbc nodeName:}" failed. No retries permitted until 2025-12-10 12:11:59.814117119 +0000 UTC m=+1205.899642333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift") pod "swift-storage-0" (UID: "41d04c65-c8a1-472a-bc74-6b20bec61fbc") : configmap "swift-ring-files" not found Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.313034 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/41d04c65-c8a1-472a-bc74-6b20bec61fbc-cache\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.337035 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-489ml\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-kube-api-access-489ml\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.338049 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.471131 4852 generic.go:334] "Generic (PLEG): container finished" podID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerID="ee8c936add246460659ca312173b12517adc514e1ffdf72843f884b96cdec1de" exitCode=0 Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.471246 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" event={"ID":"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe","Type":"ContainerDied","Data":"ee8c936add246460659ca312173b12517adc514e1ffdf72843f884b96cdec1de"} Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.472133 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" event={"ID":"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe","Type":"ContainerStarted","Data":"b5a6fb2931f3e332fcca6ed8a268cd06d53ffa4a8dc9b4953c65facd24051092"} Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.472334 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-bc6n4" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerName="dnsmasq-dns" containerID="cri-o://0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a" gracePeriod=10 Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.526164 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.821639 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:11:59 crc kubenswrapper[4852]: E1210 12:11:59.821849 4852 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 10 12:11:59 crc kubenswrapper[4852]: E1210 12:11:59.822072 4852 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 10 12:11:59 crc kubenswrapper[4852]: E1210 12:11:59.822140 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift podName:41d04c65-c8a1-472a-bc74-6b20bec61fbc nodeName:}" failed. No retries permitted until 2025-12-10 12:12:00.822120608 +0000 UTC m=+1206.907645842 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift") pod "swift-storage-0" (UID: "41d04c65-c8a1-472a-bc74-6b20bec61fbc") : configmap "swift-ring-files" not found Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.823714 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.827375 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.831714 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.831922 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.832031 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.832140 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2xsbp" Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.842335 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 10 12:11:59 crc kubenswrapper[4852]: I1210 12:11:59.898409 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.024953 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-config\") pod \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025012 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl68s\" (UniqueName: \"kubernetes.io/projected/3f42e185-974f-4bfe-95a3-8d9ba45b934a-kube-api-access-gl68s\") pod \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025067 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-sb\") pod \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025129 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-dns-svc\") pod \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025182 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-nb\") pod \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\" (UID: \"3f42e185-974f-4bfe-95a3-8d9ba45b934a\") " Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025372 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025401 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4999290-010a-43e8-9622-04a117f98f3f-config\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025435 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4999290-010a-43e8-9622-04a117f98f3f-scripts\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025464 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4999290-010a-43e8-9622-04a117f98f3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025511 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025536 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqsq\" (UniqueName: \"kubernetes.io/projected/a4999290-010a-43e8-9622-04a117f98f3f-kube-api-access-gtqsq\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.025599 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.030701 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f42e185-974f-4bfe-95a3-8d9ba45b934a-kube-api-access-gl68s" (OuterVolumeSpecName: "kube-api-access-gl68s") pod "3f42e185-974f-4bfe-95a3-8d9ba45b934a" (UID: "3f42e185-974f-4bfe-95a3-8d9ba45b934a"). InnerVolumeSpecName "kube-api-access-gl68s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.063043 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-config" (OuterVolumeSpecName: "config") pod "3f42e185-974f-4bfe-95a3-8d9ba45b934a" (UID: "3f42e185-974f-4bfe-95a3-8d9ba45b934a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.064662 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f42e185-974f-4bfe-95a3-8d9ba45b934a" (UID: "3f42e185-974f-4bfe-95a3-8d9ba45b934a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.070169 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f42e185-974f-4bfe-95a3-8d9ba45b934a" (UID: "3f42e185-974f-4bfe-95a3-8d9ba45b934a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.070865 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f42e185-974f-4bfe-95a3-8d9ba45b934a" (UID: "3f42e185-974f-4bfe-95a3-8d9ba45b934a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126599 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4999290-010a-43e8-9622-04a117f98f3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126672 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126704 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtqsq\" (UniqueName: \"kubernetes.io/projected/a4999290-010a-43e8-9622-04a117f98f3f-kube-api-access-gtqsq\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126768 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126854 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126884 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4999290-010a-43e8-9622-04a117f98f3f-config\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.126917 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4999290-010a-43e8-9622-04a117f98f3f-scripts\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.127031 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.127178 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4999290-010a-43e8-9622-04a117f98f3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.127367 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.127393 4852 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.127406 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl68s\" (UniqueName: \"kubernetes.io/projected/3f42e185-974f-4bfe-95a3-8d9ba45b934a-kube-api-access-gl68s\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.127417 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f42e185-974f-4bfe-95a3-8d9ba45b934a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.128155 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4999290-010a-43e8-9622-04a117f98f3f-config\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.128287 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4999290-010a-43e8-9622-04a117f98f3f-scripts\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.131581 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.135898 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.141842 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4999290-010a-43e8-9622-04a117f98f3f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.146782 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtqsq\" (UniqueName: \"kubernetes.io/projected/a4999290-010a-43e8-9622-04a117f98f3f-kube-api-access-gtqsq\") pod \"ovn-northd-0\" (UID: \"a4999290-010a-43e8-9622-04a117f98f3f\") " pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.154836 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.495559 4852 generic.go:334] "Generic (PLEG): container finished" podID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerID="0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a" exitCode=0 Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.496288 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bc6n4" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.496220 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bc6n4" event={"ID":"3f42e185-974f-4bfe-95a3-8d9ba45b934a","Type":"ContainerDied","Data":"0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a"} Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.497827 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bc6n4" event={"ID":"3f42e185-974f-4bfe-95a3-8d9ba45b934a","Type":"ContainerDied","Data":"f02078783e4a3b7dcb912aa3d63b4543fdced625779e025b468c9c96fa9dcc19"} Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.498018 4852 scope.go:117] "RemoveContainer" containerID="0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.503499 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" event={"ID":"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe","Type":"ContainerStarted","Data":"dd6985a6e7fa19d43bfb587d3319fece92772e7d9aa409320314fa5f34730405"} Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.530670 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podStartSLOduration=2.530653681 podStartE2EDuration="2.530653681s" podCreationTimestamp="2025-12-10 12:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:00.528714163 +0000 UTC m=+1206.614239397" watchObservedRunningTime="2025-12-10 12:12:00.530653681 +0000 UTC m=+1206.616178905" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.534337 4852 scope.go:117] "RemoveContainer" containerID="d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.558583 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bc6n4"] Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.566855 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bc6n4"] Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.567513 4852 scope.go:117] "RemoveContainer" containerID="0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a" Dec 10 12:12:00 crc kubenswrapper[4852]: E1210 12:12:00.569063 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a\": container with ID starting with 0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a not found: ID does not exist" containerID="0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.569115 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a"} err="failed to get container status \"0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a\": rpc error: code = NotFound desc = could not find container \"0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a\": container with ID starting with 0696bce88d206fef6d7675d30668dd7dea9169909092980def3248df42cb116a not found: ID does not exist" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 
12:12:00.569150 4852 scope.go:117] "RemoveContainer" containerID="d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d" Dec 10 12:12:00 crc kubenswrapper[4852]: E1210 12:12:00.570339 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d\": container with ID starting with d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d not found: ID does not exist" containerID="d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.570411 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d"} err="failed to get container status \"d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d\": rpc error: code = NotFound desc = could not find container \"d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d\": container with ID starting with d3c2f1eaed250db23f9c0e390ddbc60939db57aec5e44b9388ed4355e90e798d not found: ID does not exist" Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.621894 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 10 12:12:00 crc kubenswrapper[4852]: W1210 12:12:00.624840 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4999290_010a_43e8_9622_04a117f98f3f.slice/crio-1e60d018f574795fb378ed06b8009c5a8602523bab325d27b746a349c75ccb0a WatchSource:0}: Error finding container 1e60d018f574795fb378ed06b8009c5a8602523bab325d27b746a349c75ccb0a: Status 404 returned error can't find the container with id 1e60d018f574795fb378ed06b8009c5a8602523bab325d27b746a349c75ccb0a Dec 10 12:12:00 crc kubenswrapper[4852]: I1210 12:12:00.843159 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:12:00 crc kubenswrapper[4852]: E1210 12:12:00.843750 4852 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 10 12:12:00 crc kubenswrapper[4852]: E1210 12:12:00.843770 4852 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 10 12:12:00 crc kubenswrapper[4852]: E1210 12:12:00.843831 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift podName:41d04c65-c8a1-472a-bc74-6b20bec61fbc nodeName:}" failed. No retries permitted until 2025-12-10 12:12:02.843801569 +0000 UTC m=+1208.929326783 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift") pod "swift-storage-0" (UID: "41d04c65-c8a1-472a-bc74-6b20bec61fbc") : configmap "swift-ring-files" not found Dec 10 12:12:01 crc kubenswrapper[4852]: I1210 12:12:01.510989 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a4999290-010a-43e8-9622-04a117f98f3f","Type":"ContainerStarted","Data":"1e60d018f574795fb378ed06b8009c5a8602523bab325d27b746a349c75ccb0a"} Dec 10 12:12:01 crc kubenswrapper[4852]: I1210 12:12:01.512091 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:12:02 crc kubenswrapper[4852]: I1210 12:12:02.188974 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" path="/var/lib/kubelet/pods/3f42e185-974f-4bfe-95a3-8d9ba45b934a/volumes" Dec 10 12:12:02 crc kubenswrapper[4852]: I1210 12:12:02.520795 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a4999290-010a-43e8-9622-04a117f98f3f","Type":"ContainerStarted","Data":"92a60d71d72461977fffe786462395a47ddf53885f2b461b31732e46b04bd2de"} Dec 10 12:12:02 crc kubenswrapper[4852]: I1210 12:12:02.874786 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:12:02 crc kubenswrapper[4852]: E1210 12:12:02.875024 4852 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 10 12:12:02 crc kubenswrapper[4852]: E1210 12:12:02.875066 4852 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 10 12:12:02 crc kubenswrapper[4852]: E1210 12:12:02.875135 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift podName:41d04c65-c8a1-472a-bc74-6b20bec61fbc nodeName:}" failed. No retries permitted until 2025-12-10 12:12:06.875113219 +0000 UTC m=+1212.960638443 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift") pod "swift-storage-0" (UID: "41d04c65-c8a1-472a-bc74-6b20bec61fbc") : configmap "swift-ring-files" not found Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.181120 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5w2l8"] Dec 10 12:12:03 crc kubenswrapper[4852]: E1210 12:12:03.181769 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerName="dnsmasq-dns" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.181818 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerName="dnsmasq-dns" Dec 10 12:12:03 crc kubenswrapper[4852]: E1210 12:12:03.181840 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerName="init" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.181849 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerName="init" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.182061 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f42e185-974f-4bfe-95a3-8d9ba45b934a" containerName="dnsmasq-dns" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.182697 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.187304 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.187349 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.187427 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.193455 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5w2l8"] Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281057 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-etc-swift\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281135 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-combined-ca-bundle\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281154 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xj5b\" (UniqueName: \"kubernetes.io/projected/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-kube-api-access-5xj5b\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281176 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-ring-data-devices\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281268 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-swiftconf\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281289 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-scripts\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.281337 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-dispersionconf\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.382635 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-dispersionconf\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383020 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-etc-swift\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383151 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-combined-ca-bundle\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383307 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xj5b\" (UniqueName: \"kubernetes.io/projected/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-kube-api-access-5xj5b\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383418 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-ring-data-devices\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383538 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-swiftconf\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383645 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-scripts\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.383582 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-etc-swift\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.384222 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-ring-data-devices\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.384378 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-scripts\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.388721 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-dispersionconf\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.388822 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-combined-ca-bundle\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.389538 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-swiftconf\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.400188 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xj5b\" (UniqueName: \"kubernetes.io/projected/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-kube-api-access-5xj5b\") pod \"swift-ring-rebalance-5w2l8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.500203 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.531607 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a4999290-010a-43e8-9622-04a117f98f3f","Type":"ContainerStarted","Data":"f662ccaffb2f310748e342fcc4f78f88a83f79601fb9885d78418cac7636b241"} Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.532933 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.560623 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.12095427 podStartE2EDuration="4.560589167s" podCreationTimestamp="2025-12-10 12:11:59 +0000 UTC" firstStartedPulling="2025-12-10 12:12:00.627945874 +0000 UTC m=+1206.713471098" lastFinishedPulling="2025-12-10 12:12:02.067580771 +0000 UTC m=+1208.153105995" observedRunningTime="2025-12-10 12:12:03.557908071 +0000 UTC m=+1209.643433315" watchObservedRunningTime="2025-12-10 12:12:03.560589167 +0000 UTC m=+1209.646114391" Dec 10 12:12:03 crc kubenswrapper[4852]: W1210 12:12:03.966914 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce2fc32_02ab_4099_ac2f_c0eeca72f9a8.slice/crio-bf7d9a1da40f0c53a26a93fa9df3d8fb302e58b6dac9ecc9e6934d7ba26ef583 WatchSource:0}: Error finding container bf7d9a1da40f0c53a26a93fa9df3d8fb302e58b6dac9ecc9e6934d7ba26ef583: Status 404 returned error can't find the container with id bf7d9a1da40f0c53a26a93fa9df3d8fb302e58b6dac9ecc9e6934d7ba26ef583 Dec 10 12:12:03 crc kubenswrapper[4852]: I1210 12:12:03.967313 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5w2l8"] Dec 10 12:12:04 crc kubenswrapper[4852]: I1210 12:12:04.546402 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5w2l8" event={"ID":"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8","Type":"ContainerStarted","Data":"bf7d9a1da40f0c53a26a93fa9df3d8fb302e58b6dac9ecc9e6934d7ba26ef583"} Dec 10 12:12:04 crc kubenswrapper[4852]: I1210 12:12:04.715990 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 10 12:12:04 crc kubenswrapper[4852]: I1210 12:12:04.716037 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 10 12:12:04 crc kubenswrapper[4852]: I1210 12:12:04.788294 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 10 12:12:05 crc kubenswrapper[4852]: I1210 12:12:05.632493 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.040712 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f1e3-account-create-update-kn7lg"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.041980 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.044101 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.052807 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f1e3-account-create-update-kn7lg"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.130200 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.130259 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.131040 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qfvrl"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.132122 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.133368 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stps\" (UniqueName: \"kubernetes.io/projected/220ec4be-95ca-4ace-967b-f7bf22c7d11a-kube-api-access-8stps\") pod \"keystone-f1e3-account-create-update-kn7lg\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") " pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.133481 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220ec4be-95ca-4ace-967b-f7bf22c7d11a-operator-scripts\") pod \"keystone-f1e3-account-create-update-kn7lg\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") " pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.142522 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qfvrl"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.208394 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.235309 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220ec4be-95ca-4ace-967b-f7bf22c7d11a-operator-scripts\") pod \"keystone-f1e3-account-create-update-kn7lg\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") " pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.235438 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-operator-scripts\") pod \"keystone-db-create-qfvrl\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") " pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.235468 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ldl\" (UniqueName: \"kubernetes.io/projected/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-kube-api-access-t2ldl\") pod \"keystone-db-create-qfvrl\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") " 
pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.235491 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stps\" (UniqueName: \"kubernetes.io/projected/220ec4be-95ca-4ace-967b-f7bf22c7d11a-kube-api-access-8stps\") pod \"keystone-f1e3-account-create-update-kn7lg\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") " pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.236057 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220ec4be-95ca-4ace-967b-f7bf22c7d11a-operator-scripts\") pod \"keystone-f1e3-account-create-update-kn7lg\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") " pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.255178 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stps\" (UniqueName: \"kubernetes.io/projected/220ec4be-95ca-4ace-967b-f7bf22c7d11a-kube-api-access-8stps\") pod \"keystone-f1e3-account-create-update-kn7lg\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") " pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.331700 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4cwxn"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.332936 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.336511 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-operator-scripts\") pod \"keystone-db-create-qfvrl\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") " pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.336592 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2ldl\" (UniqueName: \"kubernetes.io/projected/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-kube-api-access-t2ldl\") pod \"keystone-db-create-qfvrl\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") " pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.337747 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-operator-scripts\") pod \"keystone-db-create-qfvrl\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") " pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.341246 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4cwxn"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.348851 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3a6c-account-create-update-gcs96"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.350169 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.359598 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3a6c-account-create-update-gcs96"] Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.363486 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.364351 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2ldl\" (UniqueName: \"kubernetes.io/projected/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-kube-api-access-t2ldl\") pod \"keystone-db-create-qfvrl\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") " pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.406810 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f1e3-account-create-update-kn7lg" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.438075 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8ae564-c6ed-47c8-9952-f18311d280c5-operator-scripts\") pod \"placement-3a6c-account-create-update-gcs96\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") " pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.438127 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hh8k\" (UniqueName: \"kubernetes.io/projected/2f8ae564-c6ed-47c8-9952-f18311d280c5-kube-api-access-5hh8k\") pod \"placement-3a6c-account-create-update-gcs96\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") " pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.438189 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5s4r\" (UniqueName: \"kubernetes.io/projected/b019d2c6-723d-49ac-953b-a5b624876c5c-kube-api-access-b5s4r\") pod \"placement-db-create-4cwxn\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") " pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.438266 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b019d2c6-723d-49ac-953b-a5b624876c5c-operator-scripts\") pod \"placement-db-create-4cwxn\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") " pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.452834 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qfvrl" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.539845 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b019d2c6-723d-49ac-953b-a5b624876c5c-operator-scripts\") pod \"placement-db-create-4cwxn\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") " pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.540010 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8ae564-c6ed-47c8-9952-f18311d280c5-operator-scripts\") pod \"placement-3a6c-account-create-update-gcs96\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") " pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.540040 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hh8k\" (UniqueName: \"kubernetes.io/projected/2f8ae564-c6ed-47c8-9952-f18311d280c5-kube-api-access-5hh8k\") pod \"placement-3a6c-account-create-update-gcs96\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") " pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.540092 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5s4r\" (UniqueName: \"kubernetes.io/projected/b019d2c6-723d-49ac-953b-a5b624876c5c-kube-api-access-b5s4r\") pod \"placement-db-create-4cwxn\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") " pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.540921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b019d2c6-723d-49ac-953b-a5b624876c5c-operator-scripts\") pod \"placement-db-create-4cwxn\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") " pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.541158 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8ae564-c6ed-47c8-9952-f18311d280c5-operator-scripts\") pod \"placement-3a6c-account-create-update-gcs96\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") " pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.558940 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5s4r\" (UniqueName: \"kubernetes.io/projected/b019d2c6-723d-49ac-953b-a5b624876c5c-kube-api-access-b5s4r\") pod \"placement-db-create-4cwxn\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") " pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.559101 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hh8k\" (UniqueName: \"kubernetes.io/projected/2f8ae564-c6ed-47c8-9952-f18311d280c5-kube-api-access-5hh8k\") pod \"placement-3a6c-account-create-update-gcs96\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") " pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.635790 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.648395 4852 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cwxn" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.695344 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3a6c-account-create-update-gcs96" Dec 10 12:12:06 crc kubenswrapper[4852]: I1210 12:12:06.945651 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:12:06 crc kubenswrapper[4852]: E1210 12:12:06.945875 4852 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 10 12:12:06 crc kubenswrapper[4852]: E1210 12:12:06.946022 4852 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 10 12:12:06 crc kubenswrapper[4852]: E1210 12:12:06.946086 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift podName:41d04c65-c8a1-472a-bc74-6b20bec61fbc nodeName:}" failed. No retries permitted until 2025-12-10 12:12:14.946069497 +0000 UTC m=+1221.031594731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift") pod "swift-storage-0" (UID: "41d04c65-c8a1-472a-bc74-6b20bec61fbc") : configmap "swift-ring-files" not found Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.429473 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.517258 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5zwrg"] Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.517482 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" podUID="1d872505-242e-4adc-acdd-756183702aba" containerName="dnsmasq-dns" containerID="cri-o://ea77f26c94cc33aac957f4c873d4145eb53a779c953ed08b5e911dbdea82a9da" gracePeriod=10 Dec 10 12:12:08 crc kubenswrapper[4852]: W1210 12:12:08.848989 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd008d61b_41dd_4c4d_be4c_d4a1de845bb5.slice/crio-d38d684fa215a3f4d707dfc20bc667d50f5c2d85203bad56ed9da241834f7b1d WatchSource:0}: Error finding container d38d684fa215a3f4d707dfc20bc667d50f5c2d85203bad56ed9da241834f7b1d: Status 404 returned error can't find the container with id d38d684fa215a3f4d707dfc20bc667d50f5c2d85203bad56ed9da241834f7b1d Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.849983 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qfvrl"] Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.867682 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f1e3-account-create-update-kn7lg"] Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.928220 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3a6c-account-create-update-gcs96"] Dec 10 12:12:08 crc kubenswrapper[4852]: I1210 12:12:08.934156 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-create-4cwxn"] Dec 10 12:12:09 crc kubenswrapper[4852]: I1210 12:12:09.591049 4852 generic.go:334] "Generic (PLEG): container finished" podID="1d872505-242e-4adc-acdd-756183702aba" containerID="ea77f26c94cc33aac957f4c873d4145eb53a779c953ed08b5e911dbdea82a9da" exitCode=0 Dec 10 12:12:09 crc kubenswrapper[4852]: I1210 12:12:09.591444 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" event={"ID":"1d872505-242e-4adc-acdd-756183702aba","Type":"ContainerDied","Data":"ea77f26c94cc33aac957f4c873d4145eb53a779c953ed08b5e911dbdea82a9da"} Dec 10 12:12:09 crc kubenswrapper[4852]: I1210 12:12:09.592562 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qfvrl" event={"ID":"d008d61b-41dd-4c4d-be4c-d4a1de845bb5","Type":"ContainerStarted","Data":"d38d684fa215a3f4d707dfc20bc667d50f5c2d85203bad56ed9da241834f7b1d"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.601409 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1e3-account-create-update-kn7lg" event={"ID":"220ec4be-95ca-4ace-967b-f7bf22c7d11a","Type":"ContainerStarted","Data":"b8e136c591aa74b6312a40d51cf9eefec5f9a5a1c7b03164d17b0242ff78188a"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.601756 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1e3-account-create-update-kn7lg" event={"ID":"220ec4be-95ca-4ace-967b-f7bf22c7d11a","Type":"ContainerStarted","Data":"fc52c9123cdb14f2c6121b1a05153fd9f4522a68e42708e981f2fa1464b9625b"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.604122 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg" event={"ID":"1d872505-242e-4adc-acdd-756183702aba","Type":"ContainerDied","Data":"38393b4663e1b2d1e443dfb0df303312efda09238e0280edda6033bb0dd00ffa"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.604153 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38393b4663e1b2d1e443dfb0df303312efda09238e0280edda6033bb0dd00ffa" Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.605559 4852 generic.go:334] "Generic (PLEG): container finished" podID="b019d2c6-723d-49ac-953b-a5b624876c5c" containerID="bff7c261363fc78a78427b3858cda8b95909a5e3fa7cb79344426b4dcbc773b2" exitCode=0 Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.605606 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cwxn" event={"ID":"b019d2c6-723d-49ac-953b-a5b624876c5c","Type":"ContainerDied","Data":"bff7c261363fc78a78427b3858cda8b95909a5e3fa7cb79344426b4dcbc773b2"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.605626 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cwxn" event={"ID":"b019d2c6-723d-49ac-953b-a5b624876c5c","Type":"ContainerStarted","Data":"529a0afd7dd301d1f2737d7c844a87714cf80a101c42af908562c0af7a5fe047"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.607509 4852 generic.go:334] "Generic (PLEG): container finished" podID="d008d61b-41dd-4c4d-be4c-d4a1de845bb5" containerID="1aa1a099437d331a2f218accc67f02364aacea8c5f8eb2392afe3c73fd40c48f" exitCode=0 Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.607546 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qfvrl" event={"ID":"d008d61b-41dd-4c4d-be4c-d4a1de845bb5","Type":"ContainerDied","Data":"1aa1a099437d331a2f218accc67f02364aacea8c5f8eb2392afe3c73fd40c48f"} Dec 
10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.608663 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5w2l8" event={"ID":"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8","Type":"ContainerStarted","Data":"37eb9aad162ac547b62a278e4a628322b9ac57402e34ba8c00c8ec705d31eab8"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.610606 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3a6c-account-create-update-gcs96" event={"ID":"2f8ae564-c6ed-47c8-9952-f18311d280c5","Type":"ContainerStarted","Data":"3b205233875c7a02bfec5a7c14297fe9ef15b31d1f506db45e99af795747ca8f"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.610634 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3a6c-account-create-update-gcs96" event={"ID":"2f8ae564-c6ed-47c8-9952-f18311d280c5","Type":"ContainerStarted","Data":"b65c464f1fda04599f86acc28d4595bdb6effeeb9a52f1b61bda7fbb165733d3"} Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.640024 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f1e3-account-create-update-kn7lg" podStartSLOduration=4.640003157 podStartE2EDuration="4.640003157s" podCreationTimestamp="2025-12-10 12:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:10.623933247 +0000 UTC m=+1216.709458471" watchObservedRunningTime="2025-12-10 12:12:10.640003157 +0000 UTC m=+1216.725528391" Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.655004 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3a6c-account-create-update-gcs96" podStartSLOduration=4.65498558 podStartE2EDuration="4.65498558s" podCreationTimestamp="2025-12-10 12:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:10.651910163 +0000 UTC m=+1216.737435407" watchObservedRunningTime="2025-12-10 12:12:10.65498558 +0000 UTC m=+1216.740510804" Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.688820 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-5w2l8" podStartSLOduration=3.312553653 podStartE2EDuration="7.688804982s" podCreationTimestamp="2025-12-10 12:12:03 +0000 UTC" firstStartedPulling="2025-12-10 12:12:03.969146531 +0000 UTC m=+1210.054671755" lastFinishedPulling="2025-12-10 12:12:08.34539786 +0000 UTC m=+1214.430923084" observedRunningTime="2025-12-10 12:12:10.68750923 +0000 UTC m=+1216.773034464" watchObservedRunningTime="2025-12-10 12:12:10.688804982 +0000 UTC m=+1216.774330206" Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.879531 4852 util.go:48] "No ready sandbox for pod can be found. 
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.879531 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg"
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.922337 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-dns-svc\") pod \"1d872505-242e-4adc-acdd-756183702aba\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") "
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.922469 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpf6b\" (UniqueName: \"kubernetes.io/projected/1d872505-242e-4adc-acdd-756183702aba-kube-api-access-dpf6b\") pod \"1d872505-242e-4adc-acdd-756183702aba\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") "
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.922542 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-config\") pod \"1d872505-242e-4adc-acdd-756183702aba\" (UID: \"1d872505-242e-4adc-acdd-756183702aba\") "
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.928881 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d872505-242e-4adc-acdd-756183702aba-kube-api-access-dpf6b" (OuterVolumeSpecName: "kube-api-access-dpf6b") pod "1d872505-242e-4adc-acdd-756183702aba" (UID: "1d872505-242e-4adc-acdd-756183702aba"). InnerVolumeSpecName "kube-api-access-dpf6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.983504 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-config" (OuterVolumeSpecName: "config") pod "1d872505-242e-4adc-acdd-756183702aba" (UID: "1d872505-242e-4adc-acdd-756183702aba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:10 crc kubenswrapper[4852]: I1210 12:12:10.988984 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d872505-242e-4adc-acdd-756183702aba" (UID: "1d872505-242e-4adc-acdd-756183702aba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.024937 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.025002 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpf6b\" (UniqueName: \"kubernetes.io/projected/1d872505-242e-4adc-acdd-756183702aba-kube-api-access-dpf6b\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.025013 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d872505-242e-4adc-acdd-756183702aba-config\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.505780 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hlx47"]
Dec 10 12:12:11 crc kubenswrapper[4852]: E1210 12:12:11.506485 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d872505-242e-4adc-acdd-756183702aba" containerName="init"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.506550 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d872505-242e-4adc-acdd-756183702aba" containerName="init"
Dec 10 12:12:11 crc kubenswrapper[4852]: E1210 12:12:11.506584 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d872505-242e-4adc-acdd-756183702aba" containerName="dnsmasq-dns"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.506594 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d872505-242e-4adc-acdd-756183702aba" containerName="dnsmasq-dns"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.509218 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d872505-242e-4adc-acdd-756183702aba" containerName="dnsmasq-dns"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.509869 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.513157 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hlx47"]
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.616896 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-aa09-account-create-update-2h8ln"]
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.632965 4852 generic.go:334] "Generic (PLEG): container finished" podID="220ec4be-95ca-4ace-967b-f7bf22c7d11a" containerID="b8e136c591aa74b6312a40d51cf9eefec5f9a5a1c7b03164d17b0242ff78188a" exitCode=0
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.636721 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6vk2\" (UniqueName: \"kubernetes.io/projected/dcb9c92f-a03f-43e3-8f43-336e4236feee-kube-api-access-n6vk2\") pod \"glance-db-create-hlx47\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") " pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.636745 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1e3-account-create-update-kn7lg" event={"ID":"220ec4be-95ca-4ace-967b-f7bf22c7d11a","Type":"ContainerDied","Data":"b8e136c591aa74b6312a40d51cf9eefec5f9a5a1c7b03164d17b0242ff78188a"}
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.636906 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.636886 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcb9c92f-a03f-43e3-8f43-336e4236feee-operator-scripts\") pod \"glance-db-create-hlx47\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") " pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.638847 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.642017 4852 generic.go:334] "Generic (PLEG): container finished" podID="2f8ae564-c6ed-47c8-9952-f18311d280c5" containerID="3b205233875c7a02bfec5a7c14297fe9ef15b31d1f506db45e99af795747ca8f" exitCode=0
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.642284 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3a6c-account-create-update-gcs96" event={"ID":"2f8ae564-c6ed-47c8-9952-f18311d280c5","Type":"ContainerDied","Data":"3b205233875c7a02bfec5a7c14297fe9ef15b31d1f506db45e99af795747ca8f"}
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.642333 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-aa09-account-create-update-2h8ln"]
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.642556 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5zwrg"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.738600 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316e219-f771-4b26-9329-4e110779b164-operator-scripts\") pod \"glance-aa09-account-create-update-2h8ln\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") " pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.738676 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6vk2\" (UniqueName: \"kubernetes.io/projected/dcb9c92f-a03f-43e3-8f43-336e4236feee-kube-api-access-n6vk2\") pod \"glance-db-create-hlx47\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") " pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.738725 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcb9c92f-a03f-43e3-8f43-336e4236feee-operator-scripts\") pod \"glance-db-create-hlx47\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") " pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.738787 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw27f\" (UniqueName: \"kubernetes.io/projected/1316e219-f771-4b26-9329-4e110779b164-kube-api-access-kw27f\") pod \"glance-aa09-account-create-update-2h8ln\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") " pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.740108 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5zwrg"]
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.740692 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcb9c92f-a03f-43e3-8f43-336e4236feee-operator-scripts\") pod \"glance-db-create-hlx47\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") " pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.747776 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5zwrg"]
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.760731 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6vk2\" (UniqueName: \"kubernetes.io/projected/dcb9c92f-a03f-43e3-8f43-336e4236feee-kube-api-access-n6vk2\") pod \"glance-db-create-hlx47\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") " pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.830792 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.840251 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316e219-f771-4b26-9329-4e110779b164-operator-scripts\") pod \"glance-aa09-account-create-update-2h8ln\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") " pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.840468 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw27f\" (UniqueName: \"kubernetes.io/projected/1316e219-f771-4b26-9329-4e110779b164-kube-api-access-kw27f\") pod \"glance-aa09-account-create-update-2h8ln\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") " pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.841363 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316e219-f771-4b26-9329-4e110779b164-operator-scripts\") pod \"glance-aa09-account-create-update-2h8ln\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") " pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:11 crc kubenswrapper[4852]: I1210 12:12:11.872031 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw27f\" (UniqueName: \"kubernetes.io/projected/1316e219-f771-4b26-9329-4e110779b164-kube-api-access-kw27f\") pod \"glance-aa09-account-create-update-2h8ln\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") " pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.006534 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.091226 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qfvrl"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.108081 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cwxn"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.147572 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b019d2c6-723d-49ac-953b-a5b624876c5c-operator-scripts\") pod \"b019d2c6-723d-49ac-953b-a5b624876c5c\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") "
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.147632 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-operator-scripts\") pod \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") "
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.147658 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2ldl\" (UniqueName: \"kubernetes.io/projected/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-kube-api-access-t2ldl\") pod \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\" (UID: \"d008d61b-41dd-4c4d-be4c-d4a1de845bb5\") "
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.147689 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5s4r\" (UniqueName: \"kubernetes.io/projected/b019d2c6-723d-49ac-953b-a5b624876c5c-kube-api-access-b5s4r\") pod \"b019d2c6-723d-49ac-953b-a5b624876c5c\" (UID: \"b019d2c6-723d-49ac-953b-a5b624876c5c\") "
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.148109 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d008d61b-41dd-4c4d-be4c-d4a1de845bb5" (UID: "d008d61b-41dd-4c4d-be4c-d4a1de845bb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.148155 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b019d2c6-723d-49ac-953b-a5b624876c5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b019d2c6-723d-49ac-953b-a5b624876c5c" (UID: "b019d2c6-723d-49ac-953b-a5b624876c5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.151949 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b019d2c6-723d-49ac-953b-a5b624876c5c-kube-api-access-b5s4r" (OuterVolumeSpecName: "kube-api-access-b5s4r") pod "b019d2c6-723d-49ac-953b-a5b624876c5c" (UID: "b019d2c6-723d-49ac-953b-a5b624876c5c"). InnerVolumeSpecName "kube-api-access-b5s4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.153148 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-kube-api-access-t2ldl" (OuterVolumeSpecName: "kube-api-access-t2ldl") pod "d008d61b-41dd-4c4d-be4c-d4a1de845bb5" (UID: "d008d61b-41dd-4c4d-be4c-d4a1de845bb5"). InnerVolumeSpecName "kube-api-access-t2ldl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.187704 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d872505-242e-4adc-acdd-756183702aba" path="/var/lib/kubelet/pods/1d872505-242e-4adc-acdd-756183702aba/volumes"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.250139 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b019d2c6-723d-49ac-953b-a5b624876c5c-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.250178 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.250188 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2ldl\" (UniqueName: \"kubernetes.io/projected/d008d61b-41dd-4c4d-be4c-d4a1de845bb5-kube-api-access-t2ldl\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.250197 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5s4r\" (UniqueName: \"kubernetes.io/projected/b019d2c6-723d-49ac-953b-a5b624876c5c-kube-api-access-b5s4r\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.306937 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hlx47"]
Dec 10 12:12:12 crc kubenswrapper[4852]: W1210 12:12:12.310666 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcb9c92f_a03f_43e3_8f43_336e4236feee.slice/crio-ea50ab318d3493b2812359870b57c4973c9628e11970180bbe5bf29fc8825b61 WatchSource:0}: Error finding container ea50ab318d3493b2812359870b57c4973c9628e11970180bbe5bf29fc8825b61: Status 404 returned error can't find the container with id ea50ab318d3493b2812359870b57c4973c9628e11970180bbe5bf29fc8825b61
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.457458 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-aa09-account-create-update-2h8ln"]
Dec 10 12:12:12 crc kubenswrapper[4852]: W1210 12:12:12.461109 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1316e219_f771_4b26_9329_4e110779b164.slice/crio-5334272be42e1bd99f0580c003c755ae86b1f5db8f9503deb3a68b303551869a WatchSource:0}: Error finding container 5334272be42e1bd99f0580c003c755ae86b1f5db8f9503deb3a68b303551869a: Status 404 returned error can't find the container with id 5334272be42e1bd99f0580c003c755ae86b1f5db8f9503deb3a68b303551869a
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.657567 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cwxn"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.657580 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cwxn" event={"ID":"b019d2c6-723d-49ac-953b-a5b624876c5c","Type":"ContainerDied","Data":"529a0afd7dd301d1f2737d7c844a87714cf80a101c42af908562c0af7a5fe047"}
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.658855 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="529a0afd7dd301d1f2737d7c844a87714cf80a101c42af908562c0af7a5fe047"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.659346 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qfvrl"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.659313 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qfvrl" event={"ID":"d008d61b-41dd-4c4d-be4c-d4a1de845bb5","Type":"ContainerDied","Data":"d38d684fa215a3f4d707dfc20bc667d50f5c2d85203bad56ed9da241834f7b1d"}
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.659601 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d38d684fa215a3f4d707dfc20bc667d50f5c2d85203bad56ed9da241834f7b1d"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.661943 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-aa09-account-create-update-2h8ln" event={"ID":"1316e219-f771-4b26-9329-4e110779b164","Type":"ContainerStarted","Data":"b168317c146d94963b81d71b26279cae19d0b1d9039b615019aa13c7cf02241a"}
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.661975 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-aa09-account-create-update-2h8ln" event={"ID":"1316e219-f771-4b26-9329-4e110779b164","Type":"ContainerStarted","Data":"5334272be42e1bd99f0580c003c755ae86b1f5db8f9503deb3a68b303551869a"}
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.664004 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlx47" event={"ID":"dcb9c92f-a03f-43e3-8f43-336e4236feee","Type":"ContainerStarted","Data":"a2dabaea4bb210129f40509462e06fb569608a9546eec76af81547c84af67341"}
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.664070 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlx47" event={"ID":"dcb9c92f-a03f-43e3-8f43-336e4236feee","Type":"ContainerStarted","Data":"ea50ab318d3493b2812359870b57c4973c9628e11970180bbe5bf29fc8825b61"}
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.689598 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-aa09-account-create-update-2h8ln" podStartSLOduration=1.689578091 podStartE2EDuration="1.689578091s" podCreationTimestamp="2025-12-10 12:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:12.684793692 +0000 UTC m=+1218.770318916" watchObservedRunningTime="2025-12-10 12:12:12.689578091 +0000 UTC m=+1218.775103315"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.965395 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f1e3-account-create-update-kn7lg"
Dec 10 12:12:12 crc kubenswrapper[4852]: I1210 12:12:12.979693 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-hlx47" podStartSLOduration=1.979673534 podStartE2EDuration="1.979673534s" podCreationTimestamp="2025-12-10 12:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:12.70882763 +0000 UTC m=+1218.794352854" watchObservedRunningTime="2025-12-10 12:12:12.979673534 +0000 UTC m=+1219.065198758"
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.077611 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8stps\" (UniqueName: \"kubernetes.io/projected/220ec4be-95ca-4ace-967b-f7bf22c7d11a-kube-api-access-8stps\") pod \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") "
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.077863 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220ec4be-95ca-4ace-967b-f7bf22c7d11a-operator-scripts\") pod \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\" (UID: \"220ec4be-95ca-4ace-967b-f7bf22c7d11a\") "
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.078852 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220ec4be-95ca-4ace-967b-f7bf22c7d11a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "220ec4be-95ca-4ace-967b-f7bf22c7d11a" (UID: "220ec4be-95ca-4ace-967b-f7bf22c7d11a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.082904 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/220ec4be-95ca-4ace-967b-f7bf22c7d11a-kube-api-access-8stps" (OuterVolumeSpecName: "kube-api-access-8stps") pod "220ec4be-95ca-4ace-967b-f7bf22c7d11a" (UID: "220ec4be-95ca-4ace-967b-f7bf22c7d11a"). InnerVolumeSpecName "kube-api-access-8stps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.089431 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3a6c-account-create-update-gcs96"
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.179506 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8ae564-c6ed-47c8-9952-f18311d280c5-operator-scripts\") pod \"2f8ae564-c6ed-47c8-9952-f18311d280c5\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") "
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.179596 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hh8k\" (UniqueName: \"kubernetes.io/projected/2f8ae564-c6ed-47c8-9952-f18311d280c5-kube-api-access-5hh8k\") pod \"2f8ae564-c6ed-47c8-9952-f18311d280c5\" (UID: \"2f8ae564-c6ed-47c8-9952-f18311d280c5\") "
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.179998 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220ec4be-95ca-4ace-967b-f7bf22c7d11a-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.180015 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8stps\" (UniqueName: \"kubernetes.io/projected/220ec4be-95ca-4ace-967b-f7bf22c7d11a-kube-api-access-8stps\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.180149 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f8ae564-c6ed-47c8-9952-f18311d280c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f8ae564-c6ed-47c8-9952-f18311d280c5" (UID: "2f8ae564-c6ed-47c8-9952-f18311d280c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.185218 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8ae564-c6ed-47c8-9952-f18311d280c5-kube-api-access-5hh8k" (OuterVolumeSpecName: "kube-api-access-5hh8k") pod "2f8ae564-c6ed-47c8-9952-f18311d280c5" (UID: "2f8ae564-c6ed-47c8-9952-f18311d280c5"). InnerVolumeSpecName "kube-api-access-5hh8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.281638 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hh8k\" (UniqueName: \"kubernetes.io/projected/2f8ae564-c6ed-47c8-9952-f18311d280c5-kube-api-access-5hh8k\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.281676 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f8ae564-c6ed-47c8-9952-f18311d280c5-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.673573 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1e3-account-create-update-kn7lg" event={"ID":"220ec4be-95ca-4ace-967b-f7bf22c7d11a","Type":"ContainerDied","Data":"fc52c9123cdb14f2c6121b1a05153fd9f4522a68e42708e981f2fa1464b9625b"}
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.673624 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc52c9123cdb14f2c6121b1a05153fd9f4522a68e42708e981f2fa1464b9625b"
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.673622 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f1e3-account-create-update-kn7lg"
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.676037 4852 generic.go:334] "Generic (PLEG): container finished" podID="1316e219-f771-4b26-9329-4e110779b164" containerID="b168317c146d94963b81d71b26279cae19d0b1d9039b615019aa13c7cf02241a" exitCode=0
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.676109 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-aa09-account-create-update-2h8ln" event={"ID":"1316e219-f771-4b26-9329-4e110779b164","Type":"ContainerDied","Data":"b168317c146d94963b81d71b26279cae19d0b1d9039b615019aa13c7cf02241a"}
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.681895 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3a6c-account-create-update-gcs96" event={"ID":"2f8ae564-c6ed-47c8-9952-f18311d280c5","Type":"ContainerDied","Data":"b65c464f1fda04599f86acc28d4595bdb6effeeb9a52f1b61bda7fbb165733d3"}
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.681940 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b65c464f1fda04599f86acc28d4595bdb6effeeb9a52f1b61bda7fbb165733d3"
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.681942 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3a6c-account-create-update-gcs96"
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.683842 4852 generic.go:334] "Generic (PLEG): container finished" podID="dcb9c92f-a03f-43e3-8f43-336e4236feee" containerID="a2dabaea4bb210129f40509462e06fb569608a9546eec76af81547c84af67341" exitCode=0
Dec 10 12:12:13 crc kubenswrapper[4852]: I1210 12:12:13.683883 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlx47" event={"ID":"dcb9c92f-a03f-43e3-8f43-336e4236feee","Type":"ContainerDied","Data":"a2dabaea4bb210129f40509462e06fb569608a9546eec76af81547c84af67341"}
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.014979 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0"
Dec 10 12:12:15 crc kubenswrapper[4852]: E1210 12:12:15.015190 4852 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 10 12:12:15 crc kubenswrapper[4852]: E1210 12:12:15.015565 4852 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 10 12:12:15 crc kubenswrapper[4852]: E1210 12:12:15.015630 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift podName:41d04c65-c8a1-472a-bc74-6b20bec61fbc nodeName:}" failed. No retries permitted until 2025-12-10 12:12:31.015609049 +0000 UTC m=+1237.101134273 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift") pod "swift-storage-0" (UID: "41d04c65-c8a1-472a-bc74-6b20bec61fbc") : configmap "swift-ring-files" not found
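[annotation] The swift-storage-0 etc-swift mount above fails because the projected ConfigMap "swift-ring-files" does not exist yet, and the kubelet reschedules the operation with exponential backoff; the logged durationBeforeRetry of 16s is consistent with a doubling schedule from a sub-second initial delay. A sketch of that schedule under assumed parameters (only the 16s value is actually visible in this log; the 500ms start and two-minute cap are assumptions):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed: 500ms initial delay, doubling on each failure, capped near 2m.
	delay := 500 * time.Millisecond
	maxDelay := 2 * time.Minute
	for attempt := 1; ; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
		if delay >= 16*time.Second { // the etc-swift mount above sits here
			break
		}
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

Under these assumptions the 16s delay corresponds to roughly the sixth consecutive failure of the same mount operation.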
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.129555 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.134881 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.218003 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw27f\" (UniqueName: \"kubernetes.io/projected/1316e219-f771-4b26-9329-4e110779b164-kube-api-access-kw27f\") pod \"1316e219-f771-4b26-9329-4e110779b164\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") "
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.218064 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6vk2\" (UniqueName: \"kubernetes.io/projected/dcb9c92f-a03f-43e3-8f43-336e4236feee-kube-api-access-n6vk2\") pod \"dcb9c92f-a03f-43e3-8f43-336e4236feee\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") "
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.218082 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316e219-f771-4b26-9329-4e110779b164-operator-scripts\") pod \"1316e219-f771-4b26-9329-4e110779b164\" (UID: \"1316e219-f771-4b26-9329-4e110779b164\") "
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.218121 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcb9c92f-a03f-43e3-8f43-336e4236feee-operator-scripts\") pod \"dcb9c92f-a03f-43e3-8f43-336e4236feee\" (UID: \"dcb9c92f-a03f-43e3-8f43-336e4236feee\") "
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.218989 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb9c92f-a03f-43e3-8f43-336e4236feee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcb9c92f-a03f-43e3-8f43-336e4236feee" (UID: "dcb9c92f-a03f-43e3-8f43-336e4236feee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.219084 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1316e219-f771-4b26-9329-4e110779b164-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1316e219-f771-4b26-9329-4e110779b164" (UID: "1316e219-f771-4b26-9329-4e110779b164"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.220160 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.223150 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1316e219-f771-4b26-9329-4e110779b164-kube-api-access-kw27f" (OuterVolumeSpecName: "kube-api-access-kw27f") pod "1316e219-f771-4b26-9329-4e110779b164" (UID: "1316e219-f771-4b26-9329-4e110779b164"). InnerVolumeSpecName "kube-api-access-kw27f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.223943 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb9c92f-a03f-43e3-8f43-336e4236feee-kube-api-access-n6vk2" (OuterVolumeSpecName: "kube-api-access-n6vk2") pod "dcb9c92f-a03f-43e3-8f43-336e4236feee" (UID: "dcb9c92f-a03f-43e3-8f43-336e4236feee"). InnerVolumeSpecName "kube-api-access-n6vk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.320810 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw27f\" (UniqueName: \"kubernetes.io/projected/1316e219-f771-4b26-9329-4e110779b164-kube-api-access-kw27f\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.320856 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6vk2\" (UniqueName: \"kubernetes.io/projected/dcb9c92f-a03f-43e3-8f43-336e4236feee-kube-api-access-n6vk2\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.320872 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316e219-f771-4b26-9329-4e110779b164-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.320883 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcb9c92f-a03f-43e3-8f43-336e4236feee-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.702570 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlx47"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.702654 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlx47" event={"ID":"dcb9c92f-a03f-43e3-8f43-336e4236feee","Type":"ContainerDied","Data":"ea50ab318d3493b2812359870b57c4973c9628e11970180bbe5bf29fc8825b61"}
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.702718 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea50ab318d3493b2812359870b57c4973c9628e11970180bbe5bf29fc8825b61"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.704992 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-aa09-account-create-update-2h8ln" event={"ID":"1316e219-f771-4b26-9329-4e110779b164","Type":"ContainerDied","Data":"5334272be42e1bd99f0580c003c755ae86b1f5db8f9503deb3a68b303551869a"}
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.705030 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5334272be42e1bd99f0580c003c755ae86b1f5db8f9503deb3a68b303551869a"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.705071 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-aa09-account-create-update-2h8ln"
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.790196 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:12:15 crc kubenswrapper[4852]: I1210 12:12:15.790275 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.847515 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gkhdx" podUID="6246b317-7d73-49ff-bd8e-f4862a4584c6" containerName="ovn-controller" probeResult="failure" output=<
Dec 10 12:12:16 crc kubenswrapper[4852]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 10 12:12:16 crc kubenswrapper[4852]: >
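[annotation] The machine-config-daemon Liveness failure above is the kubelet prober issuing an HTTP GET against the container's health endpoint and getting connection-refused (a 2xx/3xx response would count as success). A stand-alone reproduction of that check, runnable on the node itself; the endpoint URL comes from the log, everything else is illustrative:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Same endpoint the failed Liveness probe above targets.
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// With the daemon down this reports "connect: connection refused",
		// matching the probeResult output in the log.
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.Status)
}
```

The ovn-controller Readiness failure just above is an exec-style probe instead, which is why its output is the script's multi-line stderr rather than an HTTP error.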
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.890927 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8f6wn"]
Dec 10 12:12:16 crc kubenswrapper[4852]: E1210 12:12:16.891366 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b019d2c6-723d-49ac-953b-a5b624876c5c" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891388 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b019d2c6-723d-49ac-953b-a5b624876c5c" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: E1210 12:12:16.891405 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1316e219-f771-4b26-9329-4e110779b164" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891414 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1316e219-f771-4b26-9329-4e110779b164" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: E1210 12:12:16.891431 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb9c92f-a03f-43e3-8f43-336e4236feee" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891439 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb9c92f-a03f-43e3-8f43-336e4236feee" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: E1210 12:12:16.891451 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220ec4be-95ca-4ace-967b-f7bf22c7d11a" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891458 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="220ec4be-95ca-4ace-967b-f7bf22c7d11a" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: E1210 12:12:16.891481 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d008d61b-41dd-4c4d-be4c-d4a1de845bb5" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891489 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d008d61b-41dd-4c4d-be4c-d4a1de845bb5" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: E1210 12:12:16.891500 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8ae564-c6ed-47c8-9952-f18311d280c5" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891507 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8ae564-c6ed-47c8-9952-f18311d280c5" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891683 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="220ec4be-95ca-4ace-967b-f7bf22c7d11a" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891704 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8ae564-c6ed-47c8-9952-f18311d280c5" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891717 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d008d61b-41dd-4c4d-be4c-d4a1de845bb5" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891731 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b019d2c6-723d-49ac-953b-a5b624876c5c" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891747 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb9c92f-a03f-43e3-8f43-336e4236feee" containerName="mariadb-database-create"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.891756 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1316e219-f771-4b26-9329-4e110779b164" containerName="mariadb-account-create-update"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.892380 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.895348 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.895578 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-54ckn"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.899991 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8f6wn"]
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.945884 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-db-sync-config-data\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.946013 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklwk\" (UniqueName: \"kubernetes.io/projected/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-kube-api-access-qklwk\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.946040 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-config-data\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:16 crc kubenswrapper[4852]: I1210 12:12:16.946094 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-combined-ca-bundle\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.047600 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklwk\" (UniqueName: \"kubernetes.io/projected/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-kube-api-access-qklwk\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.047650 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-config-data\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.047703 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-combined-ca-bundle\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.047746 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-db-sync-config-data\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.053772 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-config-data\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.053879 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-combined-ca-bundle\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.058087 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-db-sync-config-data\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.065921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklwk\" (UniqueName: \"kubernetes.io/projected/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-kube-api-access-qklwk\") pod \"glance-db-sync-8f6wn\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.217875 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8f6wn"
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.722523 4852 generic.go:334] "Generic (PLEG): container finished" podID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerID="021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27" exitCode=0
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.722645 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15a1ed1e-209b-4c71-b15f-44caaec70e93","Type":"ContainerDied","Data":"021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27"}
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.725305 4852 generic.go:334] "Generic (PLEG): container finished" podID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerID="18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d" exitCode=0
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.725356 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09a9edae-3cd0-4f71-ba18-9800a7baefef","Type":"ContainerDied","Data":"18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d"}
Dec 10 12:12:17 crc kubenswrapper[4852]: I1210 12:12:17.890467 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8f6wn"]
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.736244 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8f6wn" event={"ID":"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f","Type":"ContainerStarted","Data":"5c4255b93e988e144ca25a6681a29666d4c954547462176c3e2c129943852094"}
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.739644 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15a1ed1e-209b-4c71-b15f-44caaec70e93","Type":"ContainerStarted","Data":"c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0"}
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.739922 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.742869 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09a9edae-3cd0-4f71-ba18-9800a7baefef","Type":"ContainerStarted","Data":"09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac"}
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.743495 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.765993 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.797431224 podStartE2EDuration="1m7.765971483s" podCreationTimestamp="2025-12-10 12:11:11 +0000 UTC" firstStartedPulling="2025-12-10 12:11:30.761639566 +0000 UTC m=+1176.847164800" lastFinishedPulling="2025-12-10 12:11:42.730179835 +0000 UTC m=+1188.815705059" observedRunningTime="2025-12-10 12:12:18.764934377 +0000 UTC m=+1224.850459601" watchObservedRunningTime="2025-12-10 12:12:18.765971483 +0000 UTC m=+1224.851496717"
Dec 10 12:12:18 crc kubenswrapper[4852]: I1210 12:12:18.790600 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=57.113400201 podStartE2EDuration="1m7.790579115s" podCreationTimestamp="2025-12-10 12:11:11 +0000 UTC" firstStartedPulling="2025-12-10 12:11:30.754382735 +0000 UTC m=+1176.839907959" lastFinishedPulling="2025-12-10 12:11:41.431561649 +0000 UTC m=+1187.517086873" observedRunningTime="2025-12-10 12:12:18.789318444 +0000 UTC m=+1224.874843678" watchObservedRunningTime="2025-12-10 12:12:18.790579115 +0000 UTC m=+1224.876104339"
Dec 10 12:12:19 crc kubenswrapper[4852]: I1210 12:12:19.755089 4852 generic.go:334] "Generic (PLEG): container finished" podID="cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" containerID="37eb9aad162ac547b62a278e4a628322b9ac57402e34ba8c00c8ec705d31eab8" exitCode=0
Dec 10 12:12:19 crc kubenswrapper[4852]: I1210 12:12:19.755445 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5w2l8" event={"ID":"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8","Type":"ContainerDied","Data":"37eb9aad162ac547b62a278e4a628322b9ac57402e34ba8c00c8ec705d31eab8"}
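[annotation] Every "SyncLoop (PLEG)" entry above carries a structured event={...} payload: the pod UID, an event type such as ContainerStarted or ContainerDied, and the container or sandbox ID. The payload is plain JSON, so it can be pulled out of a journal line mechanically; a small stand-alone parser (the struct and regex here are illustrative, not kubelet code):

```go
package main

import (
	"encoding/json"
	"fmt"
	"regexp"
)

// plegEvent mirrors the payload shape seen in the log lines above.
type plegEvent struct {
	ID   string // pod UID
	Type string // e.g. ContainerStarted, ContainerDied
	Data string // container or sandbox ID
}

var eventRE = regexp.MustCompile(`event=(\{.*?\})`)

func main() {
	line := `I1210 12:12:19.755445 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5w2l8" event={"ID":"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8","Type":"ContainerDied","Data":"37eb9aad162ac547b62a278e4a628322b9ac57402e34ba8c00c8ec705d31eab8"}`
	m := eventRE.FindStringSubmatch(line)
	if m == nil {
		return
	}
	var ev plegEvent
	if err := json.Unmarshal([]byte(m[1]), &ev); err != nil {
		fmt.Println("not a JSON payload:", err)
		return
	}
	fmt.Printf("pod %s: %s %s\n", ev.ID, ev.Type, ev.Data)
}
```

Piping a journal through something like this gives a per-pod container lifecycle timeline without the surrounding log noise.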
Need to start a new one" pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.236751 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-etc-swift\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.236820 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-combined-ca-bundle\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.237486 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-swiftconf\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.237516 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-dispersionconf\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.237631 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xj5b\" (UniqueName: \"kubernetes.io/projected/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-kube-api-access-5xj5b\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.237705 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-scripts\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.237749 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-ring-data-devices\") pod \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\" (UID: \"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8\") " Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.238424 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.239117 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.241825 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-kube-api-access-5xj5b" (OuterVolumeSpecName: "kube-api-access-5xj5b") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "kube-api-access-5xj5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.259867 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.260355 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-scripts" (OuterVolumeSpecName: "scripts") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.262875 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.267436 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" (UID: "cce2fc32-02ab-4099-ac2f-c0eeca72f9a8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339610 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339641 4852 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339653 4852 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339662 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339671 4852 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339680 4852 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.339689 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xj5b\" (UniqueName: \"kubernetes.io/projected/cce2fc32-02ab-4099-ac2f-c0eeca72f9a8-kube-api-access-5xj5b\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.770148 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5w2l8" event={"ID":"cce2fc32-02ab-4099-ac2f-c0eeca72f9a8","Type":"ContainerDied","Data":"bf7d9a1da40f0c53a26a93fa9df3d8fb302e58b6dac9ecc9e6934d7ba26ef583"} Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.770820 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf7d9a1da40f0c53a26a93fa9df3d8fb302e58b6dac9ecc9e6934d7ba26ef583" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.770396 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5w2l8" Dec 10 12:12:21 crc kubenswrapper[4852]: I1210 12:12:21.848702 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gkhdx" podUID="6246b317-7d73-49ff-bd8e-f4862a4584c6" containerName="ovn-controller" probeResult="failure" output=< Dec 10 12:12:21 crc kubenswrapper[4852]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 10 12:12:21 crc kubenswrapper[4852]: > Dec 10 12:12:26 crc kubenswrapper[4852]: I1210 12:12:26.848707 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gkhdx" podUID="6246b317-7d73-49ff-bd8e-f4862a4584c6" containerName="ovn-controller" probeResult="failure" output=< Dec 10 12:12:26 crc kubenswrapper[4852]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 10 12:12:26 crc kubenswrapper[4852]: > Dec 10 12:12:26 crc kubenswrapper[4852]: I1210 12:12:26.881570 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:12:26 crc kubenswrapper[4852]: I1210 12:12:26.894875 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qd68p" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.110132 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gkhdx-config-5d7gg"] Dec 10 12:12:27 crc kubenswrapper[4852]: E1210 12:12:27.110563 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" containerName="swift-ring-rebalance" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.110588 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" containerName="swift-ring-rebalance" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.111119 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce2fc32-02ab-4099-ac2f-c0eeca72f9a8" containerName="swift-ring-rebalance" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.111787 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.115507 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.137355 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gkhdx-config-5d7gg"] Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.256294 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-scripts\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.256366 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run-ovn\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.256407 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-additional-scripts\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.256483 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-log-ovn\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.256509 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljw96\" (UniqueName: \"kubernetes.io/projected/7336a6ff-6a4f-4b98-be01-275df300b1ba-kube-api-access-ljw96\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.256534 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358184 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-scripts\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358250 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run-ovn\") pod 
\"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358295 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-additional-scripts\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358391 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-log-ovn\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358422 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljw96\" (UniqueName: \"kubernetes.io/projected/7336a6ff-6a4f-4b98-be01-275df300b1ba-kube-api-access-ljw96\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358447 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358589 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-log-ovn\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.358771 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.359204 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-additional-scripts\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.359326 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run-ovn\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.365639 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-scripts\") pod 
\"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.381090 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljw96\" (UniqueName: \"kubernetes.io/projected/7336a6ff-6a4f-4b98-be01-275df300b1ba-kube-api-access-ljw96\") pod \"ovn-controller-gkhdx-config-5d7gg\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:27 crc kubenswrapper[4852]: I1210 12:12:27.436632 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:31 crc kubenswrapper[4852]: I1210 12:12:31.028060 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:12:31 crc kubenswrapper[4852]: I1210 12:12:31.047523 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41d04c65-c8a1-472a-bc74-6b20bec61fbc-etc-swift\") pod \"swift-storage-0\" (UID: \"41d04c65-c8a1-472a-bc74-6b20bec61fbc\") " pod="openstack/swift-storage-0" Dec 10 12:12:31 crc kubenswrapper[4852]: I1210 12:12:31.278527 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 10 12:12:31 crc kubenswrapper[4852]: I1210 12:12:31.858779 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gkhdx" podUID="6246b317-7d73-49ff-bd8e-f4862a4584c6" containerName="ovn-controller" probeResult="failure" output=< Dec 10 12:12:31 crc kubenswrapper[4852]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 10 12:12:31 crc kubenswrapper[4852]: > Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.253171 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gkhdx-config-5d7gg"] Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.348741 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 10 12:12:32 crc kubenswrapper[4852]: W1210 12:12:32.356054 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d04c65_c8a1_472a_bc74_6b20bec61fbc.slice/crio-d5de7d9b3ea3dadfe45a06a42be392c0b5f6d9cfcbffbb105e5959a447cf3ba9 WatchSource:0}: Error finding container d5de7d9b3ea3dadfe45a06a42be392c0b5f6d9cfcbffbb105e5959a447cf3ba9: Status 404 returned error can't find the container with id d5de7d9b3ea3dadfe45a06a42be392c0b5f6d9cfcbffbb105e5959a447cf3ba9 Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.780432 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.884946 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8f6wn" event={"ID":"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f","Type":"ContainerStarted","Data":"71815324b55fbf56089a5d2f7f0ba6c1662438abf686b0a3e3c13f263a05652b"} Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.891368 4852 generic.go:334] "Generic (PLEG): container finished" podID="7336a6ff-6a4f-4b98-be01-275df300b1ba" 
containerID="5a3b36140f09aca820be9e1a35063ca5d596d2e26409271f171f8e0acf5346da" exitCode=0 Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.891444 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx-config-5d7gg" event={"ID":"7336a6ff-6a4f-4b98-be01-275df300b1ba","Type":"ContainerDied","Data":"5a3b36140f09aca820be9e1a35063ca5d596d2e26409271f171f8e0acf5346da"} Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.891508 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx-config-5d7gg" event={"ID":"7336a6ff-6a4f-4b98-be01-275df300b1ba","Type":"ContainerStarted","Data":"6415f17a8dee1419dcfc1b380ef9a5e8ab46000bcd2e6fb308192f17af8ef892"} Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.893055 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"d5de7d9b3ea3dadfe45a06a42be392c0b5f6d9cfcbffbb105e5959a447cf3ba9"} Dec 10 12:12:32 crc kubenswrapper[4852]: I1210 12:12:32.904148 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8f6wn" podStartSLOduration=2.94784163 podStartE2EDuration="16.904126708s" podCreationTimestamp="2025-12-10 12:12:16 +0000 UTC" firstStartedPulling="2025-12-10 12:12:17.931186687 +0000 UTC m=+1224.016711911" lastFinishedPulling="2025-12-10 12:12:31.887471765 +0000 UTC m=+1237.972996989" observedRunningTime="2025-12-10 12:12:32.900472337 +0000 UTC m=+1238.985997561" watchObservedRunningTime="2025-12-10 12:12:32.904126708 +0000 UTC m=+1238.989651942" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.084464 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.294390 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vf7gw"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.295817 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.310008 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a82e-account-create-update-267jv"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.311348 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.329491 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.345238 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vf7gw"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.354950 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fvvwn"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.356044 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.381320 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a82e-account-create-update-267jv"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.386294 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90efc41b-4c77-4a6b-bd78-544fd72b4078-operator-scripts\") pod \"cinder-db-create-vf7gw\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.386339 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvz85\" (UniqueName: \"kubernetes.io/projected/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-kube-api-access-hvz85\") pod \"cinder-a82e-account-create-update-267jv\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.386397 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm7vz\" (UniqueName: \"kubernetes.io/projected/90efc41b-4c77-4a6b-bd78-544fd72b4078-kube-api-access-gm7vz\") pod \"cinder-db-create-vf7gw\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.386427 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-operator-scripts\") pod \"cinder-a82e-account-create-update-267jv\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.412867 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fvvwn"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.489144 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbrhz\" (UniqueName: \"kubernetes.io/projected/d4b506fd-90e8-4807-877a-ebb8ec15f09f-kube-api-access-pbrhz\") pod \"barbican-db-create-fvvwn\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.489198 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm7vz\" (UniqueName: \"kubernetes.io/projected/90efc41b-4c77-4a6b-bd78-544fd72b4078-kube-api-access-gm7vz\") pod \"cinder-db-create-vf7gw\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.489248 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-operator-scripts\") pod \"cinder-a82e-account-create-update-267jv\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.489329 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/90efc41b-4c77-4a6b-bd78-544fd72b4078-operator-scripts\") pod \"cinder-db-create-vf7gw\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.489347 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvz85\" (UniqueName: \"kubernetes.io/projected/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-kube-api-access-hvz85\") pod \"cinder-a82e-account-create-update-267jv\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.489380 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b506fd-90e8-4807-877a-ebb8ec15f09f-operator-scripts\") pod \"barbican-db-create-fvvwn\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.490288 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-operator-scripts\") pod \"cinder-a82e-account-create-update-267jv\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.490746 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90efc41b-4c77-4a6b-bd78-544fd72b4078-operator-scripts\") pod \"cinder-db-create-vf7gw\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.523643 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm7vz\" (UniqueName: \"kubernetes.io/projected/90efc41b-4c77-4a6b-bd78-544fd72b4078-kube-api-access-gm7vz\") pod \"cinder-db-create-vf7gw\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.524929 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvz85\" (UniqueName: \"kubernetes.io/projected/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-kube-api-access-hvz85\") pod \"cinder-a82e-account-create-update-267jv\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.590634 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbrhz\" (UniqueName: \"kubernetes.io/projected/d4b506fd-90e8-4807-877a-ebb8ec15f09f-kube-api-access-pbrhz\") pod \"barbican-db-create-fvvwn\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.590840 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b506fd-90e8-4807-877a-ebb8ec15f09f-operator-scripts\") pod \"barbican-db-create-fvvwn\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.591714 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d4b506fd-90e8-4807-877a-ebb8ec15f09f-operator-scripts\") pod \"barbican-db-create-fvvwn\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.609049 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2aa1-account-create-update-x8g74"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.617389 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.617420 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5kl56"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.620487 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.624347 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.630922 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.637338 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-645x7"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.638654 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.641271 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbrhz\" (UniqueName: \"kubernetes.io/projected/d4b506fd-90e8-4807-877a-ebb8ec15f09f-kube-api-access-pbrhz\") pod \"barbican-db-create-fvvwn\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.641755 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.641975 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs7cl" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.642433 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.642580 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.649538 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.655893 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2aa1-account-create-update-x8g74"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.673296 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5kl56"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.695522 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtbz\" (UniqueName: \"kubernetes.io/projected/5c123338-9441-4ded-a116-4d7d80f3032a-kube-api-access-cbtbz\") pod \"neutron-db-create-5kl56\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.720518 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sw7\" (UniqueName: \"kubernetes.io/projected/29dea678-28c8-44ea-857d-41437f6f9b24-kube-api-access-v4sw7\") pod \"barbican-2aa1-account-create-update-x8g74\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.726550 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dea678-28c8-44ea-857d-41437f6f9b24-operator-scripts\") pod \"barbican-2aa1-account-create-update-x8g74\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.726941 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c123338-9441-4ded-a116-4d7d80f3032a-operator-scripts\") pod \"neutron-db-create-5kl56\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.701017 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.759713 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-645x7"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830318 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c123338-9441-4ded-a116-4d7d80f3032a-operator-scripts\") pod \"neutron-db-create-5kl56\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830371 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trnhd\" (UniqueName: \"kubernetes.io/projected/c84e037b-cf95-44a8-b0e5-b3b468a89166-kube-api-access-trnhd\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830447 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtbz\" (UniqueName: \"kubernetes.io/projected/5c123338-9441-4ded-a116-4d7d80f3032a-kube-api-access-cbtbz\") pod \"neutron-db-create-5kl56\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830491 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-combined-ca-bundle\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830550 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sw7\" (UniqueName: \"kubernetes.io/projected/29dea678-28c8-44ea-857d-41437f6f9b24-kube-api-access-v4sw7\") pod \"barbican-2aa1-account-create-update-x8g74\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830583 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-config-data\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.830619 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dea678-28c8-44ea-857d-41437f6f9b24-operator-scripts\") pod \"barbican-2aa1-account-create-update-x8g74\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.831584 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dea678-28c8-44ea-857d-41437f6f9b24-operator-scripts\") pod \"barbican-2aa1-account-create-update-x8g74\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.832194 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c123338-9441-4ded-a116-4d7d80f3032a-operator-scripts\") pod \"neutron-db-create-5kl56\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.852871 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f122-account-create-update-zk7j7"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.853811 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sw7\" (UniqueName: \"kubernetes.io/projected/29dea678-28c8-44ea-857d-41437f6f9b24-kube-api-access-v4sw7\") pod \"barbican-2aa1-account-create-update-x8g74\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.854446 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtbz\" (UniqueName: \"kubernetes.io/projected/5c123338-9441-4ded-a116-4d7d80f3032a-kube-api-access-cbtbz\") pod \"neutron-db-create-5kl56\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.855186 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.864696 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f122-account-create-update-zk7j7"] Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.865982 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.933879 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-config-data\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.934005 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trnhd\" (UniqueName: \"kubernetes.io/projected/c84e037b-cf95-44a8-b0e5-b3b468a89166-kube-api-access-trnhd\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.934122 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-combined-ca-bundle\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.943451 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-combined-ca-bundle\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.947461 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-config-data\") pod 
\"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:33 crc kubenswrapper[4852]: I1210 12:12:33.953431 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trnhd\" (UniqueName: \"kubernetes.io/projected/c84e037b-cf95-44a8-b0e5-b3b468a89166-kube-api-access-trnhd\") pod \"keystone-db-sync-645x7\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.036552 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b687583-40f8-447f-b8fc-25fe8796f99a-operator-scripts\") pod \"neutron-f122-account-create-update-zk7j7\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.036615 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks4hg\" (UniqueName: \"kubernetes.io/projected/9b687583-40f8-447f-b8fc-25fe8796f99a-kube-api-access-ks4hg\") pod \"neutron-f122-account-create-update-zk7j7\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.080254 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.105628 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.128183 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.138634 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b687583-40f8-447f-b8fc-25fe8796f99a-operator-scripts\") pod \"neutron-f122-account-create-update-zk7j7\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.138697 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks4hg\" (UniqueName: \"kubernetes.io/projected/9b687583-40f8-447f-b8fc-25fe8796f99a-kube-api-access-ks4hg\") pod \"neutron-f122-account-create-update-zk7j7\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.140061 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b687583-40f8-447f-b8fc-25fe8796f99a-operator-scripts\") pod \"neutron-f122-account-create-update-zk7j7\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.159940 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks4hg\" (UniqueName: \"kubernetes.io/projected/9b687583-40f8-447f-b8fc-25fe8796f99a-kube-api-access-ks4hg\") pod \"neutron-f122-account-create-update-zk7j7\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.186594 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.364680 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fvvwn"] Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.381090 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vf7gw"] Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.481792 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a82e-account-create-update-267jv"] Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.644790 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5kl56"] Dec 10 12:12:34 crc kubenswrapper[4852]: W1210 12:12:34.836641 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29dea678_28c8_44ea_857d_41437f6f9b24.slice/crio-8b04aadd5b78dab22cab546d1541555363b0d0543dc7524697afca6df03ec677 WatchSource:0}: Error finding container 8b04aadd5b78dab22cab546d1541555363b0d0543dc7524697afca6df03ec677: Status 404 returned error can't find the container with id 8b04aadd5b78dab22cab546d1541555363b0d0543dc7524697afca6df03ec677 Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.836769 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2aa1-account-create-update-x8g74"] Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.922137 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-645x7"] Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.940213 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx-config-5d7gg" event={"ID":"7336a6ff-6a4f-4b98-be01-275df300b1ba","Type":"ContainerDied","Data":"6415f17a8dee1419dcfc1b380ef9a5e8ab46000bcd2e6fb308192f17af8ef892"} Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.940270 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6415f17a8dee1419dcfc1b380ef9a5e8ab46000bcd2e6fb308192f17af8ef892" Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.941577 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fvvwn" event={"ID":"d4b506fd-90e8-4807-877a-ebb8ec15f09f","Type":"ContainerStarted","Data":"950f3360c1bf038efae68f190d5c352b37f9757b96fd65573dba59a595cb3918"} Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.942722 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2aa1-account-create-update-x8g74" event={"ID":"29dea678-28c8-44ea-857d-41437f6f9b24","Type":"ContainerStarted","Data":"8b04aadd5b78dab22cab546d1541555363b0d0543dc7524697afca6df03ec677"} Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.943710 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a82e-account-create-update-267jv" event={"ID":"659aefe7-f624-4b85-9c5c-a5aab3a1a95a","Type":"ContainerStarted","Data":"193b7124f27e1f7565a5730c99b88beaab9a08c850620a1761508c7a83b032fb"} Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.944669 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5kl56" event={"ID":"5c123338-9441-4ded-a116-4d7d80f3032a","Type":"ContainerStarted","Data":"df9a2822c563b234c386aa12c5f83028899e28464b8ff232081556bf29281fca"} Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.944766 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-f122-account-create-update-zk7j7"] Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.959445 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vf7gw" event={"ID":"90efc41b-4c77-4a6b-bd78-544fd72b4078","Type":"ContainerStarted","Data":"d00bf66bad99bb9a6a0941347a562ea9d68f5f87e5ede995309ff98b0783ee07"} Dec 10 12:12:34 crc kubenswrapper[4852]: I1210 12:12:34.972634 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:34 crc kubenswrapper[4852]: W1210 12:12:34.979750 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b687583_40f8_447f_b8fc_25fe8796f99a.slice/crio-06eed85b33b76e380ed2a85115d53ddeeeefad768bde1a644468c3dd44090f81 WatchSource:0}: Error finding container 06eed85b33b76e380ed2a85115d53ddeeeefad768bde1a644468c3dd44090f81: Status 404 returned error can't find the container with id 06eed85b33b76e380ed2a85115d53ddeeeefad768bde1a644468c3dd44090f81 Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.161353 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-log-ovn\") pod \"7336a6ff-6a4f-4b98-be01-275df300b1ba\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.161479 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run\") pod \"7336a6ff-6a4f-4b98-be01-275df300b1ba\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.161583 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run-ovn\") pod \"7336a6ff-6a4f-4b98-be01-275df300b1ba\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.161631 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-scripts\") pod \"7336a6ff-6a4f-4b98-be01-275df300b1ba\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.161660 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-additional-scripts\") pod \"7336a6ff-6a4f-4b98-be01-275df300b1ba\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.161682 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljw96\" (UniqueName: \"kubernetes.io/projected/7336a6ff-6a4f-4b98-be01-275df300b1ba-kube-api-access-ljw96\") pod \"7336a6ff-6a4f-4b98-be01-275df300b1ba\" (UID: \"7336a6ff-6a4f-4b98-be01-275df300b1ba\") " Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.162410 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7336a6ff-6a4f-4b98-be01-275df300b1ba" (UID: "7336a6ff-6a4f-4b98-be01-275df300b1ba"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.162480 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7336a6ff-6a4f-4b98-be01-275df300b1ba" (UID: "7336a6ff-6a4f-4b98-be01-275df300b1ba"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.162515 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run" (OuterVolumeSpecName: "var-run") pod "7336a6ff-6a4f-4b98-be01-275df300b1ba" (UID: "7336a6ff-6a4f-4b98-be01-275df300b1ba"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.163473 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7336a6ff-6a4f-4b98-be01-275df300b1ba" (UID: "7336a6ff-6a4f-4b98-be01-275df300b1ba"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.166788 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-scripts" (OuterVolumeSpecName: "scripts") pod "7336a6ff-6a4f-4b98-be01-275df300b1ba" (UID: "7336a6ff-6a4f-4b98-be01-275df300b1ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.167378 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7336a6ff-6a4f-4b98-be01-275df300b1ba-kube-api-access-ljw96" (OuterVolumeSpecName: "kube-api-access-ljw96") pod "7336a6ff-6a4f-4b98-be01-275df300b1ba" (UID: "7336a6ff-6a4f-4b98-be01-275df300b1ba"). InnerVolumeSpecName "kube-api-access-ljw96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.263267 4852 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.263307 4852 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.263317 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.263328 4852 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7336a6ff-6a4f-4b98-be01-275df300b1ba-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.263338 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljw96\" (UniqueName: \"kubernetes.io/projected/7336a6ff-6a4f-4b98-be01-275df300b1ba-kube-api-access-ljw96\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:35 crc kubenswrapper[4852]: I1210 12:12:35.263347 4852 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7336a6ff-6a4f-4b98-be01-275df300b1ba-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.005471 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a82e-account-create-update-267jv" event={"ID":"659aefe7-f624-4b85-9c5c-a5aab3a1a95a","Type":"ContainerStarted","Data":"79e9bcb12a3d7a77a06e17d9e6a118924c2191cec910cb5bb6d49f6a784f678c"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.022407 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-645x7" event={"ID":"c84e037b-cf95-44a8-b0e5-b3b468a89166","Type":"ContainerStarted","Data":"2a69ab13f993961ba2573ccd55eee39fffc7192dcbbb3db4fb482fe1a8ca891c"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.038708 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5kl56" event={"ID":"5c123338-9441-4ded-a116-4d7d80f3032a","Type":"ContainerStarted","Data":"4578bcf7c7353951318b77dfa1c7e841e4135f8d07001766c9b0a493d4fa9b69"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.041976 4852 generic.go:334] "Generic (PLEG): container finished" podID="90efc41b-4c77-4a6b-bd78-544fd72b4078" containerID="7fe1bf6a1a2c8b9580c73582bce2d1629adc5fb021b1ab6db4cb9c2295f39451" exitCode=0 Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.042192 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vf7gw" event={"ID":"90efc41b-4c77-4a6b-bd78-544fd72b4078","Type":"ContainerDied","Data":"7fe1bf6a1a2c8b9580c73582bce2d1629adc5fb021b1ab6db4cb9c2295f39451"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.045077 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-a82e-account-create-update-267jv" podStartSLOduration=3.04505695 podStartE2EDuration="3.04505695s" podCreationTimestamp="2025-12-10 12:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:36.03184733 +0000 UTC m=+1242.117372554" watchObservedRunningTime="2025-12-10 12:12:36.04505695 +0000 UTC m=+1242.130582164" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.058940 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fvvwn" event={"ID":"d4b506fd-90e8-4807-877a-ebb8ec15f09f","Type":"ContainerStarted","Data":"7924114f3b8bc59cc02f7b301f5475503cbe2cc3c6b41d08048c478a7e2caafe"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.088561 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f122-account-create-update-zk7j7" event={"ID":"9b687583-40f8-447f-b8fc-25fe8796f99a","Type":"ContainerStarted","Data":"41036f6903dbcd535fe9cfd575b15e14f222db0f46936332e3592760ebcea532"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.088607 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f122-account-create-update-zk7j7" event={"ID":"9b687583-40f8-447f-b8fc-25fe8796f99a","Type":"ContainerStarted","Data":"06eed85b33b76e380ed2a85115d53ddeeeefad768bde1a644468c3dd44090f81"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.091729 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-5d7gg" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.091804 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-5kl56" podStartSLOduration=3.091791618 podStartE2EDuration="3.091791618s" podCreationTimestamp="2025-12-10 12:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:36.076562438 +0000 UTC m=+1242.162087652" watchObservedRunningTime="2025-12-10 12:12:36.091791618 +0000 UTC m=+1242.177316842" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.104268 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2aa1-account-create-update-x8g74" event={"ID":"29dea678-28c8-44ea-857d-41437f6f9b24","Type":"ContainerStarted","Data":"eaa96143faf031c6036d4b42e5d7cfb0636f82e5143483d529e280cc9bfcd620"} Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.146543 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-fvvwn" podStartSLOduration=3.146526766 podStartE2EDuration="3.146526766s" podCreationTimestamp="2025-12-10 12:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:36.142581087 +0000 UTC m=+1242.228106311" watchObservedRunningTime="2025-12-10 12:12:36.146526766 +0000 UTC m=+1242.232051990" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.202864 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gkhdx-config-5d7gg"] Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.220425 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gkhdx-config-5d7gg"] Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.233084 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f122-account-create-update-zk7j7" podStartSLOduration=3.233064538 podStartE2EDuration="3.233064538s" podCreationTimestamp="2025-12-10 12:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-10 12:12:36.192011952 +0000 UTC m=+1242.277537176" watchObservedRunningTime="2025-12-10 12:12:36.233064538 +0000 UTC m=+1242.318589772" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.246917 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-2aa1-account-create-update-x8g74" podStartSLOduration=3.246899014 podStartE2EDuration="3.246899014s" podCreationTimestamp="2025-12-10 12:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:36.231416947 +0000 UTC m=+1242.316942171" watchObservedRunningTime="2025-12-10 12:12:36.246899014 +0000 UTC m=+1242.332424238" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.303187 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gkhdx-config-m2tb2"] Dec 10 12:12:36 crc kubenswrapper[4852]: E1210 12:12:36.303555 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7336a6ff-6a4f-4b98-be01-275df300b1ba" containerName="ovn-config" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.303570 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="7336a6ff-6a4f-4b98-be01-275df300b1ba" containerName="ovn-config" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.303728 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="7336a6ff-6a4f-4b98-be01-275df300b1ba" containerName="ovn-config" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.304321 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: W1210 12:12:36.306768 4852 reflector.go:561] object-"openstack"/"ovncontroller-extra-scripts": failed to list *v1.ConfigMap: configmaps "ovncontroller-extra-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 10 12:12:36 crc kubenswrapper[4852]: E1210 12:12:36.306821 4852 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovncontroller-extra-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovncontroller-extra-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.320022 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gkhdx-config-m2tb2"] Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.412256 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-log-ovn\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.412322 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7564w\" (UniqueName: \"kubernetes.io/projected/817e43a3-97fb-425b-a46c-21df9c9aba21-kube-api-access-7564w\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " 
pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.412348 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.412432 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run-ovn\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.412477 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.412500 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.513822 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.513907 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run-ovn\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.513944 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.513963 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.514018 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-log-ovn\") pod 
\"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.514038 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7564w\" (UniqueName: \"kubernetes.io/projected/817e43a3-97fb-425b-a46c-21df9c9aba21-kube-api-access-7564w\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.514663 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run-ovn\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.514664 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-log-ovn\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.514930 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.516621 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.545534 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7564w\" (UniqueName: \"kubernetes.io/projected/817e43a3-97fb-425b-a46c-21df9c9aba21-kube-api-access-7564w\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:36 crc kubenswrapper[4852]: I1210 12:12:36.865040 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-gkhdx" Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.109348 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"c9eed7e7cd3c324d7b50792e691069aaa730d65b0bd6c44b3e50713c57788861"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.109661 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"7c56278c2f1c2e8126c7df094c88d1573f1cebd3d5b59729efda8732cafce1d8"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.109677 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"3ebf9988e32f8bd5e337fee3bb3d362e0e6cbc20ad50371314ef104f8d854687"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.112081 4852 generic.go:334] "Generic (PLEG): container finished" podID="659aefe7-f624-4b85-9c5c-a5aab3a1a95a" containerID="79e9bcb12a3d7a77a06e17d9e6a118924c2191cec910cb5bb6d49f6a784f678c" exitCode=0 Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.112131 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a82e-account-create-update-267jv" event={"ID":"659aefe7-f624-4b85-9c5c-a5aab3a1a95a","Type":"ContainerDied","Data":"79e9bcb12a3d7a77a06e17d9e6a118924c2191cec910cb5bb6d49f6a784f678c"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.117691 4852 generic.go:334] "Generic (PLEG): container finished" podID="5c123338-9441-4ded-a116-4d7d80f3032a" containerID="4578bcf7c7353951318b77dfa1c7e841e4135f8d07001766c9b0a493d4fa9b69" exitCode=0 Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.117792 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5kl56" event={"ID":"5c123338-9441-4ded-a116-4d7d80f3032a","Type":"ContainerDied","Data":"4578bcf7c7353951318b77dfa1c7e841e4135f8d07001766c9b0a493d4fa9b69"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.125288 4852 generic.go:334] "Generic (PLEG): container finished" podID="d4b506fd-90e8-4807-877a-ebb8ec15f09f" containerID="7924114f3b8bc59cc02f7b301f5475503cbe2cc3c6b41d08048c478a7e2caafe" exitCode=0 Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.125485 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fvvwn" event={"ID":"d4b506fd-90e8-4807-877a-ebb8ec15f09f","Type":"ContainerDied","Data":"7924114f3b8bc59cc02f7b301f5475503cbe2cc3c6b41d08048c478a7e2caafe"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.130482 4852 generic.go:334] "Generic (PLEG): container finished" podID="9b687583-40f8-447f-b8fc-25fe8796f99a" containerID="41036f6903dbcd535fe9cfd575b15e14f222db0f46936332e3592760ebcea532" exitCode=0 Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.130553 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f122-account-create-update-zk7j7" event={"ID":"9b687583-40f8-447f-b8fc-25fe8796f99a","Type":"ContainerDied","Data":"41036f6903dbcd535fe9cfd575b15e14f222db0f46936332e3592760ebcea532"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.132060 4852 generic.go:334] "Generic (PLEG): container finished" podID="29dea678-28c8-44ea-857d-41437f6f9b24" containerID="eaa96143faf031c6036d4b42e5d7cfb0636f82e5143483d529e280cc9bfcd620" exitCode=0 Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.132263 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2aa1-account-create-update-x8g74" event={"ID":"29dea678-28c8-44ea-857d-41437f6f9b24","Type":"ContainerDied","Data":"eaa96143faf031c6036d4b42e5d7cfb0636f82e5143483d529e280cc9bfcd620"} Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.494960 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:37 crc kubenswrapper[4852]: E1210 12:12:37.514609 4852 configmap.go:193] Couldn't get configMap openstack/ovncontroller-extra-scripts: failed to sync configmap cache: timed out waiting for the condition Dec 10 12:12:37 crc kubenswrapper[4852]: E1210 12:12:37.514710 4852 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts podName:817e43a3-97fb-425b-a46c-21df9c9aba21 nodeName:}" failed. No retries permitted until 2025-12-10 12:12:38.014686231 +0000 UTC m=+1244.100211455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "additional-scripts" (UniqueName: "kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts") pod "ovn-controller-gkhdx-config-m2tb2" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21") : failed to sync configmap cache: timed out waiting for the condition Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.537941 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90efc41b-4c77-4a6b-bd78-544fd72b4078-operator-scripts\") pod \"90efc41b-4c77-4a6b-bd78-544fd72b4078\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.538264 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm7vz\" (UniqueName: \"kubernetes.io/projected/90efc41b-4c77-4a6b-bd78-544fd72b4078-kube-api-access-gm7vz\") pod \"90efc41b-4c77-4a6b-bd78-544fd72b4078\" (UID: \"90efc41b-4c77-4a6b-bd78-544fd72b4078\") " Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.538795 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90efc41b-4c77-4a6b-bd78-544fd72b4078-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90efc41b-4c77-4a6b-bd78-544fd72b4078" (UID: "90efc41b-4c77-4a6b-bd78-544fd72b4078"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.549505 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90efc41b-4c77-4a6b-bd78-544fd72b4078-kube-api-access-gm7vz" (OuterVolumeSpecName: "kube-api-access-gm7vz") pod "90efc41b-4c77-4a6b-bd78-544fd72b4078" (UID: "90efc41b-4c77-4a6b-bd78-544fd72b4078"). InnerVolumeSpecName "kube-api-access-gm7vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.573960 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.641349 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm7vz\" (UniqueName: \"kubernetes.io/projected/90efc41b-4c77-4a6b-bd78-544fd72b4078-kube-api-access-gm7vz\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:37 crc kubenswrapper[4852]: I1210 12:12:37.641434 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90efc41b-4c77-4a6b-bd78-544fd72b4078-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.047903 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.048689 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts\") pod \"ovn-controller-gkhdx-config-m2tb2\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.119916 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.149244 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"dd014b80eba4c3200e1ba242668cff9db1112fb1e8b0f99fe81f32ce820d6847"} Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.152627 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vf7gw" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.155867 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vf7gw" event={"ID":"90efc41b-4c77-4a6b-bd78-544fd72b4078","Type":"ContainerDied","Data":"d00bf66bad99bb9a6a0941347a562ea9d68f5f87e5ede995309ff98b0783ee07"} Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.155896 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d00bf66bad99bb9a6a0941347a562ea9d68f5f87e5ede995309ff98b0783ee07" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.186148 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7336a6ff-6a4f-4b98-be01-275df300b1ba" path="/var/lib/kubelet/pods/7336a6ff-6a4f-4b98-be01-275df300b1ba/volumes" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.847500 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.860885 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c123338-9441-4ded-a116-4d7d80f3032a-operator-scripts\") pod \"5c123338-9441-4ded-a116-4d7d80f3032a\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.861147 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtbz\" (UniqueName: \"kubernetes.io/projected/5c123338-9441-4ded-a116-4d7d80f3032a-kube-api-access-cbtbz\") pod \"5c123338-9441-4ded-a116-4d7d80f3032a\" (UID: \"5c123338-9441-4ded-a116-4d7d80f3032a\") " Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.866877 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c123338-9441-4ded-a116-4d7d80f3032a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c123338-9441-4ded-a116-4d7d80f3032a" (UID: "5c123338-9441-4ded-a116-4d7d80f3032a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.908797 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c123338-9441-4ded-a116-4d7d80f3032a-kube-api-access-cbtbz" (OuterVolumeSpecName: "kube-api-access-cbtbz") pod "5c123338-9441-4ded-a116-4d7d80f3032a" (UID: "5c123338-9441-4ded-a116-4d7d80f3032a"). InnerVolumeSpecName "kube-api-access-cbtbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.964700 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c123338-9441-4ded-a116-4d7d80f3032a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:38 crc kubenswrapper[4852]: I1210 12:12:38.965288 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtbz\" (UniqueName: \"kubernetes.io/projected/5c123338-9441-4ded-a116-4d7d80f3032a-kube-api-access-cbtbz\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:39 crc kubenswrapper[4852]: I1210 12:12:39.165365 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5kl56" event={"ID":"5c123338-9441-4ded-a116-4d7d80f3032a","Type":"ContainerDied","Data":"df9a2822c563b234c386aa12c5f83028899e28464b8ff232081556bf29281fca"} Dec 10 12:12:39 crc kubenswrapper[4852]: I1210 12:12:39.165442 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df9a2822c563b234c386aa12c5f83028899e28464b8ff232081556bf29281fca" Dec 10 12:12:39 crc kubenswrapper[4852]: I1210 12:12:39.165455 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5kl56" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.585580 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.625474 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.642975 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b506fd-90e8-4807-877a-ebb8ec15f09f-operator-scripts\") pod \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.643021 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbrhz\" (UniqueName: \"kubernetes.io/projected/d4b506fd-90e8-4807-877a-ebb8ec15f09f-kube-api-access-pbrhz\") pod \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\" (UID: \"d4b506fd-90e8-4807-877a-ebb8ec15f09f\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.644191 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b506fd-90e8-4807-877a-ebb8ec15f09f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4b506fd-90e8-4807-877a-ebb8ec15f09f" (UID: "d4b506fd-90e8-4807-877a-ebb8ec15f09f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.651979 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b506fd-90e8-4807-877a-ebb8ec15f09f-kube-api-access-pbrhz" (OuterVolumeSpecName: "kube-api-access-pbrhz") pod "d4b506fd-90e8-4807-877a-ebb8ec15f09f" (UID: "d4b506fd-90e8-4807-877a-ebb8ec15f09f"). InnerVolumeSpecName "kube-api-access-pbrhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.707892 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.719974 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.744767 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4sw7\" (UniqueName: \"kubernetes.io/projected/29dea678-28c8-44ea-857d-41437f6f9b24-kube-api-access-v4sw7\") pod \"29dea678-28c8-44ea-857d-41437f6f9b24\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.744818 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dea678-28c8-44ea-857d-41437f6f9b24-operator-scripts\") pod \"29dea678-28c8-44ea-857d-41437f6f9b24\" (UID: \"29dea678-28c8-44ea-857d-41437f6f9b24\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.744850 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b687583-40f8-447f-b8fc-25fe8796f99a-operator-scripts\") pod \"9b687583-40f8-447f-b8fc-25fe8796f99a\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.744904 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks4hg\" (UniqueName: \"kubernetes.io/projected/9b687583-40f8-447f-b8fc-25fe8796f99a-kube-api-access-ks4hg\") pod \"9b687583-40f8-447f-b8fc-25fe8796f99a\" (UID: \"9b687583-40f8-447f-b8fc-25fe8796f99a\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.745004 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvz85\" (UniqueName: \"kubernetes.io/projected/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-kube-api-access-hvz85\") pod \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.745056 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-operator-scripts\") pod \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\" (UID: \"659aefe7-f624-4b85-9c5c-a5aab3a1a95a\") " Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.745394 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4b506fd-90e8-4807-877a-ebb8ec15f09f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.745410 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbrhz\" (UniqueName: \"kubernetes.io/projected/d4b506fd-90e8-4807-877a-ebb8ec15f09f-kube-api-access-pbrhz\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.745786 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "659aefe7-f624-4b85-9c5c-a5aab3a1a95a" (UID: "659aefe7-f624-4b85-9c5c-a5aab3a1a95a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.745800 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b687583-40f8-447f-b8fc-25fe8796f99a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b687583-40f8-447f-b8fc-25fe8796f99a" (UID: "9b687583-40f8-447f-b8fc-25fe8796f99a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.746712 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29dea678-28c8-44ea-857d-41437f6f9b24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29dea678-28c8-44ea-857d-41437f6f9b24" (UID: "29dea678-28c8-44ea-857d-41437f6f9b24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.749597 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29dea678-28c8-44ea-857d-41437f6f9b24-kube-api-access-v4sw7" (OuterVolumeSpecName: "kube-api-access-v4sw7") pod "29dea678-28c8-44ea-857d-41437f6f9b24" (UID: "29dea678-28c8-44ea-857d-41437f6f9b24"). InnerVolumeSpecName "kube-api-access-v4sw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.749831 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b687583-40f8-447f-b8fc-25fe8796f99a-kube-api-access-ks4hg" (OuterVolumeSpecName: "kube-api-access-ks4hg") pod "9b687583-40f8-447f-b8fc-25fe8796f99a" (UID: "9b687583-40f8-447f-b8fc-25fe8796f99a"). InnerVolumeSpecName "kube-api-access-ks4hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.751927 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-kube-api-access-hvz85" (OuterVolumeSpecName: "kube-api-access-hvz85") pod "659aefe7-f624-4b85-9c5c-a5aab3a1a95a" (UID: "659aefe7-f624-4b85-9c5c-a5aab3a1a95a"). InnerVolumeSpecName "kube-api-access-hvz85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.764307 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gkhdx-config-m2tb2"] Dec 10 12:12:43 crc kubenswrapper[4852]: W1210 12:12:43.778416 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod817e43a3_97fb_425b_a46c_21df9c9aba21.slice/crio-dfaaeebefdc83f40483cd4d40a1123564fc60ae7a2b60f1f55a6ff3eebc76238 WatchSource:0}: Error finding container dfaaeebefdc83f40483cd4d40a1123564fc60ae7a2b60f1f55a6ff3eebc76238: Status 404 returned error can't find the container with id dfaaeebefdc83f40483cd4d40a1123564fc60ae7a2b60f1f55a6ff3eebc76238 Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.846996 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4sw7\" (UniqueName: \"kubernetes.io/projected/29dea678-28c8-44ea-857d-41437f6f9b24-kube-api-access-v4sw7\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.847045 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dea678-28c8-44ea-857d-41437f6f9b24-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.847060 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b687583-40f8-447f-b8fc-25fe8796f99a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.847071 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks4hg\" (UniqueName: \"kubernetes.io/projected/9b687583-40f8-447f-b8fc-25fe8796f99a-kube-api-access-ks4hg\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.847085 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvz85\" (UniqueName: \"kubernetes.io/projected/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-kube-api-access-hvz85\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:43 crc kubenswrapper[4852]: I1210 12:12:43.847096 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/659aefe7-f624-4b85-9c5c-a5aab3a1a95a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.221240 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2aa1-account-create-update-x8g74" event={"ID":"29dea678-28c8-44ea-857d-41437f6f9b24","Type":"ContainerDied","Data":"8b04aadd5b78dab22cab546d1541555363b0d0543dc7524697afca6df03ec677"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.221586 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b04aadd5b78dab22cab546d1541555363b0d0543dc7524697afca6df03ec677" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.221673 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2aa1-account-create-update-x8g74" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.224961 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a82e-account-create-update-267jv" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.225063 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a82e-account-create-update-267jv" event={"ID":"659aefe7-f624-4b85-9c5c-a5aab3a1a95a","Type":"ContainerDied","Data":"193b7124f27e1f7565a5730c99b88beaab9a08c850620a1761508c7a83b032fb"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.225101 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="193b7124f27e1f7565a5730c99b88beaab9a08c850620a1761508c7a83b032fb" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.228995 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"27b879696324de9279c031bf044109bd67c4b07ff0fc3c418cc79c9c066229f3"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.229038 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"5a2134873d1f21e5dd968d5af66070baf89f455b7b166da02cb0f493c5fa17d7"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.229049 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"c9d9528bbd4b6a47ddd77a00c4a9cf0b179726de6edc1b113567ee89501940d6"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.229057 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"115a0d83b8e619b355efb4bea23f63bdb5bcc7e9c15752467caffaf2a3cfe63a"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.238004 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-645x7" event={"ID":"c84e037b-cf95-44a8-b0e5-b3b468a89166","Type":"ContainerStarted","Data":"d0b3e7795546ac037395e937bf6b92120661c52eed7432cd66bdf290b39b19a3"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.242619 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fvvwn" event={"ID":"d4b506fd-90e8-4807-877a-ebb8ec15f09f","Type":"ContainerDied","Data":"950f3360c1bf038efae68f190d5c352b37f9757b96fd65573dba59a595cb3918"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.242745 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950f3360c1bf038efae68f190d5c352b37f9757b96fd65573dba59a595cb3918" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.242879 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fvvwn" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.246735 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f122-account-create-update-zk7j7" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.246755 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f122-account-create-update-zk7j7" event={"ID":"9b687583-40f8-447f-b8fc-25fe8796f99a","Type":"ContainerDied","Data":"06eed85b33b76e380ed2a85115d53ddeeeefad768bde1a644468c3dd44090f81"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.246787 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06eed85b33b76e380ed2a85115d53ddeeeefad768bde1a644468c3dd44090f81" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.252191 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx-config-m2tb2" event={"ID":"817e43a3-97fb-425b-a46c-21df9c9aba21","Type":"ContainerStarted","Data":"f0a8c30e5408dcb44ac3d5a10083d6fd81f0da70fcc45393280458a7d25d7a8c"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.252307 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx-config-m2tb2" event={"ID":"817e43a3-97fb-425b-a46c-21df9c9aba21","Type":"ContainerStarted","Data":"dfaaeebefdc83f40483cd4d40a1123564fc60ae7a2b60f1f55a6ff3eebc76238"} Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.266970 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-645x7" podStartSLOduration=2.864446736 podStartE2EDuration="11.26695439s" podCreationTimestamp="2025-12-10 12:12:33 +0000 UTC" firstStartedPulling="2025-12-10 12:12:34.973145456 +0000 UTC m=+1241.058670680" lastFinishedPulling="2025-12-10 12:12:43.37565311 +0000 UTC m=+1249.461178334" observedRunningTime="2025-12-10 12:12:44.258446197 +0000 UTC m=+1250.343971441" watchObservedRunningTime="2025-12-10 12:12:44.26695439 +0000 UTC m=+1250.352479604" Dec 10 12:12:44 crc kubenswrapper[4852]: I1210 12:12:44.284886 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gkhdx-config-m2tb2" podStartSLOduration=8.284869408 podStartE2EDuration="8.284869408s" podCreationTimestamp="2025-12-10 12:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:44.278687893 +0000 UTC m=+1250.364213127" watchObservedRunningTime="2025-12-10 12:12:44.284869408 +0000 UTC m=+1250.370394632" Dec 10 12:12:45 crc kubenswrapper[4852]: I1210 12:12:45.261159 4852 generic.go:334] "Generic (PLEG): container finished" podID="817e43a3-97fb-425b-a46c-21df9c9aba21" containerID="f0a8c30e5408dcb44ac3d5a10083d6fd81f0da70fcc45393280458a7d25d7a8c" exitCode=0 Dec 10 12:12:45 crc kubenswrapper[4852]: I1210 12:12:45.261310 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gkhdx-config-m2tb2" event={"ID":"817e43a3-97fb-425b-a46c-21df9c9aba21","Type":"ContainerDied","Data":"f0a8c30e5408dcb44ac3d5a10083d6fd81f0da70fcc45393280458a7d25d7a8c"} Dec 10 12:12:45 crc kubenswrapper[4852]: I1210 12:12:45.264663 4852 generic.go:334] "Generic (PLEG): container finished" podID="bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" containerID="71815324b55fbf56089a5d2f7f0ba6c1662438abf686b0a3e3c13f263a05652b" exitCode=0 Dec 10 12:12:45 crc kubenswrapper[4852]: I1210 12:12:45.264725 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8f6wn" 
event={"ID":"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f","Type":"ContainerDied","Data":"71815324b55fbf56089a5d2f7f0ba6c1662438abf686b0a3e3c13f263a05652b"} Dec 10 12:12:45 crc kubenswrapper[4852]: I1210 12:12:45.789697 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:12:45 crc kubenswrapper[4852]: I1210 12:12:45.789764 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.281221 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"9f04085a8fd10b0c772231381cb825d4325eca672cea35b60f91b23445338157"} Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.281554 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"6ba802115b976f169f479a225ab5888b20ad8784f6ce12a79cc906b6ec56f53d"} Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.281625 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"dfa6cfba16f2a0e8709a1953efe8210ec96dfecae81c1e097505f4d793431602"} Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.281640 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"0e95ce40d3fd6a852cee1df5691a191eef3d6784155f7ea296438088f309e545"} Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.521149 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.536931 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run\") pod \"817e43a3-97fb-425b-a46c-21df9c9aba21\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.536998 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-log-ovn\") pod \"817e43a3-97fb-425b-a46c-21df9c9aba21\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.537024 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7564w\" (UniqueName: \"kubernetes.io/projected/817e43a3-97fb-425b-a46c-21df9c9aba21-kube-api-access-7564w\") pod \"817e43a3-97fb-425b-a46c-21df9c9aba21\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.537048 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts\") pod \"817e43a3-97fb-425b-a46c-21df9c9aba21\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.537085 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run-ovn\") pod \"817e43a3-97fb-425b-a46c-21df9c9aba21\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.537125 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-scripts\") pod \"817e43a3-97fb-425b-a46c-21df9c9aba21\" (UID: \"817e43a3-97fb-425b-a46c-21df9c9aba21\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.538382 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-scripts" (OuterVolumeSpecName: "scripts") pod "817e43a3-97fb-425b-a46c-21df9c9aba21" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.538934 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "817e43a3-97fb-425b-a46c-21df9c9aba21" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.538970 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run" (OuterVolumeSpecName: "var-run") pod "817e43a3-97fb-425b-a46c-21df9c9aba21" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.538991 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "817e43a3-97fb-425b-a46c-21df9c9aba21" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.539081 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "817e43a3-97fb-425b-a46c-21df9c9aba21" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.551520 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817e43a3-97fb-425b-a46c-21df9c9aba21-kube-api-access-7564w" (OuterVolumeSpecName: "kube-api-access-7564w") pod "817e43a3-97fb-425b-a46c-21df9c9aba21" (UID: "817e43a3-97fb-425b-a46c-21df9c9aba21"). InnerVolumeSpecName "kube-api-access-7564w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.638821 4852 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.639114 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.639136 4852 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-run\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.639151 4852 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/817e43a3-97fb-425b-a46c-21df9c9aba21-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.639164 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7564w\" (UniqueName: \"kubernetes.io/projected/817e43a3-97fb-425b-a46c-21df9c9aba21-kube-api-access-7564w\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.639175 4852 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/817e43a3-97fb-425b-a46c-21df9c9aba21-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.731784 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8f6wn" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.822581 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gkhdx-config-m2tb2"] Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.834063 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gkhdx-config-m2tb2"] Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.842166 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qklwk\" (UniqueName: \"kubernetes.io/projected/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-kube-api-access-qklwk\") pod \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.842292 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-combined-ca-bundle\") pod \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.842344 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-config-data\") pod \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.842391 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-db-sync-config-data\") pod \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\" (UID: \"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f\") " Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.849848 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-kube-api-access-qklwk" (OuterVolumeSpecName: "kube-api-access-qklwk") pod "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" (UID: "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f"). InnerVolumeSpecName "kube-api-access-qklwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.854376 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" (UID: "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.889239 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" (UID: "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.894538 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-config-data" (OuterVolumeSpecName: "config-data") pod "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" (UID: "bf4bd98a-477c-4b9a-8cab-ccb69f404b1f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.944437 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qklwk\" (UniqueName: \"kubernetes.io/projected/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-kube-api-access-qklwk\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.944472 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.944484 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:46 crc kubenswrapper[4852]: I1210 12:12:46.944496 4852 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.297081 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"de3cbeb3c962a0d2b312f7ac452753dcfe84a016d4814b5e6ed9d27ec396fbe1"} Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.297435 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"4548ea793adc1527121de77b05ea9f1ed28c70cb17233baa10c373a4c5fd3a1e"} Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.297452 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"41d04c65-c8a1-472a-bc74-6b20bec61fbc","Type":"ContainerStarted","Data":"4f596c4ebf133027bc3b8cb4283bd36fc8fef69e059c6fce2d148b1951ec26f2"} Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.299357 4852 generic.go:334] "Generic (PLEG): container finished" podID="c84e037b-cf95-44a8-b0e5-b3b468a89166" containerID="d0b3e7795546ac037395e937bf6b92120661c52eed7432cd66bdf290b39b19a3" exitCode=0 Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.299420 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-645x7" event={"ID":"c84e037b-cf95-44a8-b0e5-b3b468a89166","Type":"ContainerDied","Data":"d0b3e7795546ac037395e937bf6b92120661c52eed7432cd66bdf290b39b19a3"} Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.301418 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8f6wn" event={"ID":"bf4bd98a-477c-4b9a-8cab-ccb69f404b1f","Type":"ContainerDied","Data":"5c4255b93e988e144ca25a6681a29666d4c954547462176c3e2c129943852094"} Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.301476 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4255b93e988e144ca25a6681a29666d4c954547462176c3e2c129943852094" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.301493 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8f6wn" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.307840 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfaaeebefdc83f40483cd4d40a1123564fc60ae7a2b60f1f55a6ff3eebc76238" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.307889 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gkhdx-config-m2tb2" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.347926 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.205641639 podStartE2EDuration="49.347905404s" podCreationTimestamp="2025-12-10 12:11:58 +0000 UTC" firstStartedPulling="2025-12-10 12:12:32.358894024 +0000 UTC m=+1238.444419238" lastFinishedPulling="2025-12-10 12:12:45.501157769 +0000 UTC m=+1251.586683003" observedRunningTime="2025-12-10 12:12:47.342589782 +0000 UTC m=+1253.428115066" watchObservedRunningTime="2025-12-10 12:12:47.347905404 +0000 UTC m=+1253.433430628" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.688683 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-h45dx"] Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689037 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90efc41b-4c77-4a6b-bd78-544fd72b4078" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689061 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="90efc41b-4c77-4a6b-bd78-544fd72b4078" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689074 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b687583-40f8-447f-b8fc-25fe8796f99a" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689080 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b687583-40f8-447f-b8fc-25fe8796f99a" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689093 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817e43a3-97fb-425b-a46c-21df9c9aba21" containerName="ovn-config" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689099 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="817e43a3-97fb-425b-a46c-21df9c9aba21" containerName="ovn-config" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689109 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dea678-28c8-44ea-857d-41437f6f9b24" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689115 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dea678-28c8-44ea-857d-41437f6f9b24" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689127 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b506fd-90e8-4807-877a-ebb8ec15f09f" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689133 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b506fd-90e8-4807-877a-ebb8ec15f09f" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689142 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" containerName="glance-db-sync" Dec 10 12:12:47 crc 
kubenswrapper[4852]: I1210 12:12:47.689149 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" containerName="glance-db-sync" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689165 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659aefe7-f624-4b85-9c5c-a5aab3a1a95a" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689174 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="659aefe7-f624-4b85-9c5c-a5aab3a1a95a" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.689188 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c123338-9441-4ded-a116-4d7d80f3032a" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689196 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c123338-9441-4ded-a116-4d7d80f3032a" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689468 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="659aefe7-f624-4b85-9c5c-a5aab3a1a95a" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689485 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b687583-40f8-447f-b8fc-25fe8796f99a" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689498 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c123338-9441-4ded-a116-4d7d80f3032a" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689507 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b506fd-90e8-4807-877a-ebb8ec15f09f" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689520 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="90efc41b-4c77-4a6b-bd78-544fd72b4078" containerName="mariadb-database-create" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689529 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="29dea678-28c8-44ea-857d-41437f6f9b24" containerName="mariadb-account-create-update" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689538 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="817e43a3-97fb-425b-a46c-21df9c9aba21" containerName="ovn-config" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.689551 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" containerName="glance-db-sync" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.690362 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.692389 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.717334 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-h45dx"] Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.808287 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-h45dx"] Dec 10 12:12:47 crc kubenswrapper[4852]: E1210 12:12:47.808856 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-wtlfr ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" podUID="e286f7ab-9083-4c00-b69d-f466327ffec8" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.842019 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-4b9mk"] Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.843443 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.849900 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-4b9mk"] Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.872912 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtlfr\" (UniqueName: \"kubernetes.io/projected/e286f7ab-9083-4c00-b69d-f466327ffec8-kube-api-access-wtlfr\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.873019 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.873084 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-config\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.873148 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.873178 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.873215 
4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974471 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974530 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-config\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974561 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974581 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtlfr\" (UniqueName: \"kubernetes.io/projected/e286f7ab-9083-4c00-b69d-f466327ffec8-kube-api-access-wtlfr\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974620 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw8b\" (UniqueName: \"kubernetes.io/projected/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-kube-api-access-mrw8b\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974666 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974698 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974732 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974757 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-config\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974782 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974798 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.974812 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.975461 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.975914 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.976131 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.976347 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-config\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.976520 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:47 crc kubenswrapper[4852]: I1210 12:12:47.997185 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtlfr\" (UniqueName: \"kubernetes.io/projected/e286f7ab-9083-4c00-b69d-f466327ffec8-kube-api-access-wtlfr\") pod \"dnsmasq-dns-5c79d794d7-h45dx\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.075954 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw8b\" (UniqueName: \"kubernetes.io/projected/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-kube-api-access-mrw8b\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.076033 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.076061 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.076097 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.076128 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-config\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.076152 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.077298 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.078057 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.078675 4852 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.079327 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-config\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.079833 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.096975 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw8b\" (UniqueName: \"kubernetes.io/projected/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-kube-api-access-mrw8b\") pod \"dnsmasq-dns-5f59b8f679-4b9mk\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.162945 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.180818 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817e43a3-97fb-425b-a46c-21df9c9aba21" path="/var/lib/kubelet/pods/817e43a3-97fb-425b-a46c-21df9c9aba21/volumes" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.318427 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.333527 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.484735 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-swift-storage-0\") pod \"e286f7ab-9083-4c00-b69d-f466327ffec8\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.484823 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-nb\") pod \"e286f7ab-9083-4c00-b69d-f466327ffec8\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.484946 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-svc\") pod \"e286f7ab-9083-4c00-b69d-f466327ffec8\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.484979 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-sb\") pod \"e286f7ab-9083-4c00-b69d-f466327ffec8\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.485041 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-config\") pod \"e286f7ab-9083-4c00-b69d-f466327ffec8\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.485175 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtlfr\" (UniqueName: \"kubernetes.io/projected/e286f7ab-9083-4c00-b69d-f466327ffec8-kube-api-access-wtlfr\") pod \"e286f7ab-9083-4c00-b69d-f466327ffec8\" (UID: \"e286f7ab-9083-4c00-b69d-f466327ffec8\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.488928 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e286f7ab-9083-4c00-b69d-f466327ffec8" (UID: "e286f7ab-9083-4c00-b69d-f466327ffec8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.489075 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-config" (OuterVolumeSpecName: "config") pod "e286f7ab-9083-4c00-b69d-f466327ffec8" (UID: "e286f7ab-9083-4c00-b69d-f466327ffec8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.489395 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e286f7ab-9083-4c00-b69d-f466327ffec8" (UID: "e286f7ab-9083-4c00-b69d-f466327ffec8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.489832 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e286f7ab-9083-4c00-b69d-f466327ffec8" (UID: "e286f7ab-9083-4c00-b69d-f466327ffec8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.489893 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e286f7ab-9083-4c00-b69d-f466327ffec8" (UID: "e286f7ab-9083-4c00-b69d-f466327ffec8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.492669 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e286f7ab-9083-4c00-b69d-f466327ffec8-kube-api-access-wtlfr" (OuterVolumeSpecName: "kube-api-access-wtlfr") pod "e286f7ab-9083-4c00-b69d-f466327ffec8" (UID: "e286f7ab-9083-4c00-b69d-f466327ffec8"). InnerVolumeSpecName "kube-api-access-wtlfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.587518 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.587561 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.587576 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.587589 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtlfr\" (UniqueName: \"kubernetes.io/projected/e286f7ab-9083-4c00-b69d-f466327ffec8-kube-api-access-wtlfr\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.587600 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.587612 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e286f7ab-9083-4c00-b69d-f466327ffec8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.589623 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-4b9mk"] Dec 10 12:12:48 crc kubenswrapper[4852]: W1210 12:12:48.597659 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a3fd8e9_e2f9_4a5f_a97a_3fcbd8003081.slice/crio-f26e10a62fe2667dbed96768b1403caaeb17d330fcd15fcff157557f0ea34d13 WatchSource:0}: Error finding container 
f26e10a62fe2667dbed96768b1403caaeb17d330fcd15fcff157557f0ea34d13: Status 404 returned error can't find the container with id f26e10a62fe2667dbed96768b1403caaeb17d330fcd15fcff157557f0ea34d13 Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.745886 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.892492 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trnhd\" (UniqueName: \"kubernetes.io/projected/c84e037b-cf95-44a8-b0e5-b3b468a89166-kube-api-access-trnhd\") pod \"c84e037b-cf95-44a8-b0e5-b3b468a89166\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.892615 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-combined-ca-bundle\") pod \"c84e037b-cf95-44a8-b0e5-b3b468a89166\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.892652 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-config-data\") pod \"c84e037b-cf95-44a8-b0e5-b3b468a89166\" (UID: \"c84e037b-cf95-44a8-b0e5-b3b468a89166\") " Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.913575 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84e037b-cf95-44a8-b0e5-b3b468a89166-kube-api-access-trnhd" (OuterVolumeSpecName: "kube-api-access-trnhd") pod "c84e037b-cf95-44a8-b0e5-b3b468a89166" (UID: "c84e037b-cf95-44a8-b0e5-b3b468a89166"). InnerVolumeSpecName "kube-api-access-trnhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.956193 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-config-data" (OuterVolumeSpecName: "config-data") pod "c84e037b-cf95-44a8-b0e5-b3b468a89166" (UID: "c84e037b-cf95-44a8-b0e5-b3b468a89166"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.963363 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c84e037b-cf95-44a8-b0e5-b3b468a89166" (UID: "c84e037b-cf95-44a8-b0e5-b3b468a89166"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.998359 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.998404 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trnhd\" (UniqueName: \"kubernetes.io/projected/c84e037b-cf95-44a8-b0e5-b3b468a89166-kube-api-access-trnhd\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:48 crc kubenswrapper[4852]: I1210 12:12:48.998418 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84e037b-cf95-44a8-b0e5-b3b468a89166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.326271 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-645x7" event={"ID":"c84e037b-cf95-44a8-b0e5-b3b468a89166","Type":"ContainerDied","Data":"2a69ab13f993961ba2573ccd55eee39fffc7192dcbbb3db4fb482fe1a8ca891c"} Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.326316 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a69ab13f993961ba2573ccd55eee39fffc7192dcbbb3db4fb482fe1a8ca891c" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.326297 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-645x7" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.327910 4852 generic.go:334] "Generic (PLEG): container finished" podID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerID="18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb" exitCode=0 Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.327961 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-h45dx" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.327947 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" event={"ID":"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081","Type":"ContainerDied","Data":"18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb"} Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.327994 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" event={"ID":"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081","Type":"ContainerStarted","Data":"f26e10a62fe2667dbed96768b1403caaeb17d330fcd15fcff157557f0ea34d13"} Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.407081 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-h45dx"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.424016 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-h45dx"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.549493 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-4b9mk"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.580055 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dgswv"] Dec 10 12:12:49 crc kubenswrapper[4852]: E1210 12:12:49.580432 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84e037b-cf95-44a8-b0e5-b3b468a89166" containerName="keystone-db-sync" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.580450 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84e037b-cf95-44a8-b0e5-b3b468a89166" containerName="keystone-db-sync" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.580622 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84e037b-cf95-44a8-b0e5-b3b468a89166" containerName="keystone-db-sync" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.581437 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.605823 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dgswv"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.627880 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qvqxs"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.629132 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.632885 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.632993 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.632906 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.633359 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs7cl" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.633540 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.671175 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qvqxs"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.711278 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.711630 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.711695 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.711742 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.711817 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n84xf\" (UniqueName: \"kubernetes.io/projected/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-kube-api-access-n84xf\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.711860 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-config\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.791813 4852 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/horizon-7d6fc7b549-vj5mb"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.793469 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.801801 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.801867 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-wzlcl" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.802145 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.802420 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.806765 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d6fc7b549-vj5mb"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.816930 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-credential-keys\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.816997 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-scripts\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817028 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817079 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817150 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n84xf\" (UniqueName: \"kubernetes.io/projected/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-kube-api-access-n84xf\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817184 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-combined-ca-bundle\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817215 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-config\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817285 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/113430a5-0056-4448-918d-e77a8feb53fd-kube-api-access-2t6qw\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817325 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817351 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817375 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-fernet-keys\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.817407 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-config-data\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.818490 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.819098 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-config\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.826083 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.829396 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.835897 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.845288 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bw8zp"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.846812 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.859143 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.859418 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.859634 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wxml9" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.867376 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bw8zp"] Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.881816 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n84xf\" (UniqueName: \"kubernetes.io/projected/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-kube-api-access-n84xf\") pod \"dnsmasq-dns-bbf5cc879-dgswv\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921192 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397ac00-7478-4849-900c-cd19b2c83305-logs\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921273 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-combined-ca-bundle\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921331 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8397ac00-7478-4849-900c-cd19b2c83305-horizon-secret-key\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921368 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/113430a5-0056-4448-918d-e77a8feb53fd-kube-api-access-2t6qw\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 
crc kubenswrapper[4852]: I1210 12:12:49.921406 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-fernet-keys\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921428 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-scripts\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921452 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-config-data\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921476 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-credential-keys\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921496 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-config-data\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921517 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2tvf\" (UniqueName: \"kubernetes.io/projected/8397ac00-7478-4849-900c-cd19b2c83305-kube-api-access-x2tvf\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.921540 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-scripts\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.943665 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:49 crc kubenswrapper[4852]: I1210 12:12:49.958979 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-config-data\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.003444 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-combined-ca-bundle\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.007829 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-credential-keys\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.009059 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/113430a5-0056-4448-918d-e77a8feb53fd-kube-api-access-2t6qw\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.009380 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-scripts\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.021327 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-fernet-keys\") pod \"keystone-bootstrap-qvqxs\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.028510 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.030854 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56x6w\" (UniqueName: \"kubernetes.io/projected/5db4372d-41b4-4247-97ab-7f27026c2a82-kube-api-access-56x6w\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.030903 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397ac00-7478-4849-900c-cd19b2c83305-logs\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.030957 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-combined-ca-bundle\") pod \"neutron-db-sync-bw8zp\" (UID: 
\"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.030987 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-config\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.031008 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8397ac00-7478-4849-900c-cd19b2c83305-horizon-secret-key\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.031048 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-scripts\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.031080 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-config-data\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.031101 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2tvf\" (UniqueName: \"kubernetes.io/projected/8397ac00-7478-4849-900c-cd19b2c83305-kube-api-access-x2tvf\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.031789 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397ac00-7478-4849-900c-cd19b2c83305-logs\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.053501 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-scripts\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.055561 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.058573 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-config-data\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.063702 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.063964 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.073827 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2tvf\" (UniqueName: \"kubernetes.io/projected/8397ac00-7478-4849-900c-cd19b2c83305-kube-api-access-x2tvf\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.096821 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8397ac00-7478-4849-900c-cd19b2c83305-horizon-secret-key\") pod \"horizon-7d6fc7b549-vj5mb\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.102282 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.116603 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qwzww"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.117699 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.120280 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qwzww"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.127607 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.128049 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.128286 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zdp77" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142712 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-combined-ca-bundle\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142778 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142807 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-run-httpd\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142848 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-config\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142875 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-config-data\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142908 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.142992 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-scripts\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.143020 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-log-httpd\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.143081 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56x6w\" (UniqueName: \"kubernetes.io/projected/5db4372d-41b4-4247-97ab-7f27026c2a82-kube-api-access-56x6w\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.143124 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzd2\" (UniqueName: \"kubernetes.io/projected/02730e24-a11e-4c7b-9470-9290b251bcb9-kube-api-access-dnzd2\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.159239 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.163326 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-config\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.186414 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-combined-ca-bundle\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.198008 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56x6w\" (UniqueName: \"kubernetes.io/projected/5db4372d-41b4-4247-97ab-7f27026c2a82-kube-api-access-56x6w\") pod \"neutron-db-sync-bw8zp\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.245332 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e286f7ab-9083-4c00-b69d-f466327ffec8" path="/var/lib/kubelet/pods/e286f7ab-9083-4c00-b69d-f466327ffec8/volumes" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.245891 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mwvfs"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246361 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-etc-machine-id\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246396 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wsdq\" (UniqueName: \"kubernetes.io/projected/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-kube-api-access-8wsdq\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246422 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246442 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-run-httpd\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246463 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-config-data\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246486 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246521 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-scripts\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246541 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-db-sync-config-data\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246564 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-scripts\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246582 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-log-httpd\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246604 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-combined-ca-bundle\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246654 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzd2\" (UniqueName: \"kubernetes.io/projected/02730e24-a11e-4c7b-9470-9290b251bcb9-kube-api-access-dnzd2\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" 
Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.246672 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-config-data\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.247429 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mwvfs"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.247525 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.248805 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.249449 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.252736 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.252989 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-run-httpd\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.276000 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-log-httpd\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.277330 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.277476 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.277624 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-54ckn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.277652 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.278202 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.280060 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bslbz" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.280583 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.286904 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d8d7b9c7-kcmvn"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.287533 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.296780 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-scripts\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.297822 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.299212 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dgswv"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.303627 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-config-data\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.310203 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzd2\" (UniqueName: \"kubernetes.io/projected/02730e24-a11e-4c7b-9470-9290b251bcb9-kube-api-access-dnzd2\") pod \"ceilometer-0\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.312642 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d8d7b9c7-kcmvn"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.330345 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ml9gs"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.335368 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.338273 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.338552 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xsg7h" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.338798 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.343639 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ml9gs"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347612 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-config-data\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347658 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtc2\" (UniqueName: \"kubernetes.io/projected/f7444e87-8857-4aee-b9c6-5797c8203b8d-kube-api-access-zrtc2\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347694 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347738 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-etc-machine-id\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347756 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wsdq\" (UniqueName: \"kubernetes.io/projected/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-kube-api-access-8wsdq\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347773 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-db-sync-config-data\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347796 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347827 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-combined-ca-bundle\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347909 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-scripts\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347927 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpfpb\" (UniqueName: \"kubernetes.io/projected/71453d1b-e7a6-44a3-a449-b1c10eb76997-kube-api-access-dpfpb\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347974 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-db-sync-config-data\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.347991 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.348017 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.348043 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.348072 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-combined-ca-bundle\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.348122 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.353255 4852 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l4k5p"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.354578 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.355147 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-etc-machine-id\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.358652 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" event={"ID":"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081","Type":"ContainerStarted","Data":"f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4"} Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.359280 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.360175 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-config-data\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.367626 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-scripts\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.368399 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-db-sync-config-data\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.369759 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-combined-ca-bundle\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.373644 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l4k5p"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.380439 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wsdq\" (UniqueName: \"kubernetes.io/projected/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-kube-api-access-8wsdq\") pod \"cinder-db-sync-qwzww\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.386494 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.388458 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.389739 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.400410 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.402966 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449688 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449769 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449794 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/517ef493-1599-408d-bf6d-0e0eaef4d28c-logs\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449826 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-db-sync-config-data\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449847 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cbzn\" (UniqueName: \"kubernetes.io/projected/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-kube-api-access-9cbzn\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449866 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449888 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-combined-ca-bundle\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449922 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-scripts\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc 
kubenswrapper[4852]: I1210 12:12:50.449943 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-config\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449957 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449972 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.449996 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-scripts\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.450020 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-scripts\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.450037 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpfpb\" (UniqueName: \"kubernetes.io/projected/71453d1b-e7a6-44a3-a449-b1c10eb76997-kube-api-access-dpfpb\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.450058 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-config-data\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.451704 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452321 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452347 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452365 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-combined-ca-bundle\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452404 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452519 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452545 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7w9b\" (UniqueName: \"kubernetes.io/projected/b9ace072-49c8-4747-b7d5-1f6f01393c41-kube-api-access-r7w9b\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452595 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pt6x\" (UniqueName: \"kubernetes.io/projected/517ef493-1599-408d-bf6d-0e0eaef4d28c-kube-api-access-9pt6x\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452615 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452632 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-642d8\" (UniqueName: \"kubernetes.io/projected/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-kube-api-access-642d8\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452655 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-config-data\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 
12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452673 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-config-data\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452690 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452711 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452796 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-logs\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452813 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452855 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-logs\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452875 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-horizon-secret-key\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.452898 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtc2\" (UniqueName: \"kubernetes.io/projected/f7444e87-8857-4aee-b9c6-5797c8203b8d-kube-api-access-zrtc2\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.453481 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc 
kubenswrapper[4852]: I1210 12:12:50.455870 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-db-sync-config-data\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.458022 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.458547 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.472924 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.483080 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-combined-ca-bundle\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.486070 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.488641 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpfpb\" (UniqueName: \"kubernetes.io/projected/71453d1b-e7a6-44a3-a449-b1c10eb76997-kube-api-access-dpfpb\") pod \"barbican-db-sync-mwvfs\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") " pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.488981 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtc2\" (UniqueName: \"kubernetes.io/projected/f7444e87-8857-4aee-b9c6-5797c8203b8d-kube-api-access-zrtc2\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.506497 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" podStartSLOduration=3.506478907 podStartE2EDuration="3.506478907s" podCreationTimestamp="2025-12-10 12:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:50.421171796 +0000 UTC m=+1256.506697030" 
watchObservedRunningTime="2025-12-10 12:12:50.506478907 +0000 UTC m=+1256.592004131" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.514899 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.516643 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.536884 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qwzww" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562459 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-scripts\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562497 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-config\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562517 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562533 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562552 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-scripts\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562572 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-scripts\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562595 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-config-data\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562614 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562630 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-combined-ca-bundle\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562657 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7w9b\" (UniqueName: \"kubernetes.io/projected/b9ace072-49c8-4747-b7d5-1f6f01393c41-kube-api-access-r7w9b\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562674 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pt6x\" (UniqueName: \"kubernetes.io/projected/517ef493-1599-408d-bf6d-0e0eaef4d28c-kube-api-access-9pt6x\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562689 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562703 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-642d8\" (UniqueName: \"kubernetes.io/projected/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-kube-api-access-642d8\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562723 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-config-data\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562739 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-config-data\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562756 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562785 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-logs\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562802 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562817 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-logs\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562833 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-horizon-secret-key\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562856 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562873 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/517ef493-1599-408d-bf6d-0e0eaef4d28c-logs\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.562892 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cbzn\" (UniqueName: \"kubernetes.io/projected/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-kube-api-access-9cbzn\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.566346 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-logs\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.567169 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-scripts\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.569269 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/517ef493-1599-408d-bf6d-0e0eaef4d28c-logs\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" 
Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.570770 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.574033 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-config-data\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.574388 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-logs\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.574606 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.574772 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-combined-ca-bundle\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.587460 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-scripts\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.589507 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.592280 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-scripts\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.593662 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.595890 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-config-data\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.596724 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-config\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.597171 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.597777 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.597902 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.598076 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-horizon-secret-key\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.599203 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-config-data\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.599814 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-642d8\" (UniqueName: \"kubernetes.io/projected/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-kube-api-access-642d8\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.614598 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.621021 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cbzn\" (UniqueName: \"kubernetes.io/projected/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-kube-api-access-9cbzn\") pod \"horizon-5d8d7b9c7-kcmvn\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.623702 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pt6x\" (UniqueName: \"kubernetes.io/projected/517ef493-1599-408d-bf6d-0e0eaef4d28c-kube-api-access-9pt6x\") pod \"placement-db-sync-ml9gs\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.628309 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7w9b\" (UniqueName: \"kubernetes.io/projected/b9ace072-49c8-4747-b7d5-1f6f01393c41-kube-api-access-r7w9b\") pod \"dnsmasq-dns-56df8fb6b7-l4k5p\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.637867 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.659159 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.661201 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.677193 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dgswv"] Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.681048 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ml9gs" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.708932 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.725294 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:12:50 crc kubenswrapper[4852]: W1210 12:12:50.777693 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4548a8a_1a72_4ca3_a9b8_b3e2f3769e70.slice/crio-bd64e43ce2f3d0e6991c88c44f8e4e1a3c09c2e6caaf095ba39a4c16e5fe327f WatchSource:0}: Error finding container bd64e43ce2f3d0e6991c88c44f8e4e1a3c09c2e6caaf095ba39a4c16e5fe327f: Status 404 returned error can't find the container with id bd64e43ce2f3d0e6991c88c44f8e4e1a3c09c2e6caaf095ba39a4c16e5fe327f Dec 10 12:12:50 crc kubenswrapper[4852]: I1210 12:12:50.850468 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d6fc7b549-vj5mb"] Dec 10 12:12:50 crc kubenswrapper[4852]: W1210 12:12:50.921056 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8397ac00_7478_4849_900c_cd19b2c83305.slice/crio-12b3494977e5ff2063020f0ac0b8b2234e1760a8b7bf0be753fa3a03396b4106 WatchSource:0}: Error finding container 12b3494977e5ff2063020f0ac0b8b2234e1760a8b7bf0be753fa3a03396b4106: Status 404 returned error can't find the container with id 12b3494977e5ff2063020f0ac0b8b2234e1760a8b7bf0be753fa3a03396b4106 Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.025758 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qvqxs"] Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.100963 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bw8zp"] Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.374307 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qvqxs" event={"ID":"113430a5-0056-4448-918d-e77a8feb53fd","Type":"ContainerStarted","Data":"a0bba005ffdf5ec712b59d4c1ea55ae6093e224ae6d4642a76152003f665a9dd"} Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.374773 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qvqxs" event={"ID":"113430a5-0056-4448-918d-e77a8feb53fd","Type":"ContainerStarted","Data":"fb5b7dbd2a254707bb133faaa114f3429d2d918f6c530e73ee1c33be3e2d9438"} Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.375889 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bw8zp" event={"ID":"5db4372d-41b4-4247-97ab-7f27026c2a82","Type":"ContainerStarted","Data":"8ad4c344ae7d8bb5aec287e20dcc5e5bf5807ec74c444cfd4d5aad94210ad599"} Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.377474 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d6fc7b549-vj5mb" event={"ID":"8397ac00-7478-4849-900c-cd19b2c83305","Type":"ContainerStarted","Data":"12b3494977e5ff2063020f0ac0b8b2234e1760a8b7bf0be753fa3a03396b4106"} Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.394392 4852 generic.go:334] "Generic (PLEG): container finished" podID="c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" containerID="b6813cea2d28b6627a49e30f7141b16d2ee9aad8e9a4aec7c6c4eddd76a27914" exitCode=0 Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.394603 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerName="dnsmasq-dns" containerID="cri-o://f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4" gracePeriod=10 Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.395267 4852 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" event={"ID":"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70","Type":"ContainerDied","Data":"b6813cea2d28b6627a49e30f7141b16d2ee9aad8e9a4aec7c6c4eddd76a27914"} Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.395311 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" event={"ID":"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70","Type":"ContainerStarted","Data":"bd64e43ce2f3d0e6991c88c44f8e4e1a3c09c2e6caaf095ba39a4c16e5fe327f"} Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.407360 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l4k5p"] Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.417996 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.419306 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qvqxs" podStartSLOduration=2.419284495 podStartE2EDuration="2.419284495s" podCreationTimestamp="2025-12-10 12:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:51.398214189 +0000 UTC m=+1257.483739413" watchObservedRunningTime="2025-12-10 12:12:51.419284495 +0000 UTC m=+1257.504809739" Dec 10 12:12:51 crc kubenswrapper[4852]: W1210 12:12:51.428034 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9ace072_49c8_4747_b7d5_1f6f01393c41.slice/crio-5e9d1e0bc1a45a21e8db4ef09416f28370ed53d92887169f42f216cfe54d11cc WatchSource:0}: Error finding container 5e9d1e0bc1a45a21e8db4ef09416f28370ed53d92887169f42f216cfe54d11cc: Status 404 returned error can't find the container with id 5e9d1e0bc1a45a21e8db4ef09416f28370ed53d92887169f42f216cfe54d11cc Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.599101 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qwzww"] Dec 10 12:12:51 crc kubenswrapper[4852]: W1210 12:12:51.618762 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cc9fcd2_20c3_439e_9b1a_45bc8fb10f8c.slice/crio-203d83cac456628bfde78fb2b5daf30479be5e86ba1441d7fae50722c414ac03 WatchSource:0}: Error finding container 203d83cac456628bfde78fb2b5daf30479be5e86ba1441d7fae50722c414ac03: Status 404 returned error can't find the container with id 203d83cac456628bfde78fb2b5daf30479be5e86ba1441d7fae50722c414ac03 Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.926354 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mwvfs"] Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.952751 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ml9gs"] Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.956175 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:51 crc kubenswrapper[4852]: I1210 12:12:51.960175 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d8d7b9c7-kcmvn"] Dec 10 12:12:51 crc kubenswrapper[4852]: W1210 12:12:51.961246 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71453d1b_e7a6_44a3_a449_b1c10eb76997.slice/crio-8b1e91142767259115ece3aa2dcc77ceefe6ea6de2a28a3c7da1ba3668e801fa WatchSource:0}: Error finding container 8b1e91142767259115ece3aa2dcc77ceefe6ea6de2a28a3c7da1ba3668e801fa: Status 404 returned error can't find the container with id 8b1e91142767259115ece3aa2dcc77ceefe6ea6de2a28a3c7da1ba3668e801fa Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.010016 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-config\") pod \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.010081 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrw8b\" (UniqueName: \"kubernetes.io/projected/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-kube-api-access-mrw8b\") pod \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.010152 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-svc\") pod \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.010261 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-sb\") pod \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.010289 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-swift-storage-0\") pod \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.010314 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-nb\") pod \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\" (UID: \"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.037681 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-kube-api-access-mrw8b" (OuterVolumeSpecName: "kube-api-access-mrw8b") pod "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" (UID: "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081"). InnerVolumeSpecName "kube-api-access-mrw8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: W1210 12:12:52.043644 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod517ef493_1599_408d_bf6d_0e0eaef4d28c.slice/crio-0c6b0aa9969d1daef16fbc43fbfa32ee82f269558a3c2949dd00b1a54243bf79 WatchSource:0}: Error finding container 0c6b0aa9969d1daef16fbc43fbfa32ee82f269558a3c2949dd00b1a54243bf79: Status 404 returned error can't find the container with id 0c6b0aa9969d1daef16fbc43fbfa32ee82f269558a3c2949dd00b1a54243bf79 Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.048899 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.050662 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.111455 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-config\") pod \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.111830 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-sb\") pod \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.111906 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-svc\") pod \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.111941 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-swift-storage-0\") pod \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.112003 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n84xf\" (UniqueName: \"kubernetes.io/projected/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-kube-api-access-n84xf\") pod \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.112031 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-nb\") pod \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\" (UID: \"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70\") " Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.112496 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrw8b\" (UniqueName: \"kubernetes.io/projected/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-kube-api-access-mrw8b\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.124486 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-kube-api-access-n84xf" (OuterVolumeSpecName: "kube-api-access-n84xf") pod "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" (UID: "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70"). InnerVolumeSpecName "kube-api-access-n84xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.128816 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" (UID: "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.130375 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-config" (OuterVolumeSpecName: "config") pod "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" (UID: "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.132131 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" (UID: "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.135803 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" (UID: "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.144382 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" (UID: "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.146185 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-config" (OuterVolumeSpecName: "config") pod "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" (UID: "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.149311 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" (UID: "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.155643 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" (UID: "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.162207 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" (UID: "2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.186191 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" (UID: "c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216636 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216687 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216701 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n84xf\" (UniqueName: \"kubernetes.io/projected/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-kube-api-access-n84xf\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216715 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216729 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216743 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216754 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216765 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc 
kubenswrapper[4852]: I1210 12:12:52.216778 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216789 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.216800 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.416109 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qwzww" event={"ID":"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c","Type":"ContainerStarted","Data":"203d83cac456628bfde78fb2b5daf30479be5e86ba1441d7fae50722c414ac03"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.418783 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" event={"ID":"c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70","Type":"ContainerDied","Data":"bd64e43ce2f3d0e6991c88c44f8e4e1a3c09c2e6caaf095ba39a4c16e5fe327f"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.418799 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-dgswv" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.418830 4852 scope.go:117] "RemoveContainer" containerID="b6813cea2d28b6627a49e30f7141b16d2ee9aad8e9a4aec7c6c4eddd76a27914" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.421672 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8d7b9c7-kcmvn" event={"ID":"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec","Type":"ContainerStarted","Data":"abc7616b8ea40219032b61025f49592c456240ea74613e2e2610cda1361e84f1"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.423421 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mwvfs" event={"ID":"71453d1b-e7a6-44a3-a449-b1c10eb76997","Type":"ContainerStarted","Data":"8b1e91142767259115ece3aa2dcc77ceefe6ea6de2a28a3c7da1ba3668e801fa"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.434106 4852 generic.go:334] "Generic (PLEG): container finished" podID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerID="a2c74957f435247da30c451f17c0da080de3ceff932d1512b66ec724898ba694" exitCode=0 Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.434188 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" event={"ID":"b9ace072-49c8-4747-b7d5-1f6f01393c41","Type":"ContainerDied","Data":"a2c74957f435247da30c451f17c0da080de3ceff932d1512b66ec724898ba694"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.434236 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" event={"ID":"b9ace072-49c8-4747-b7d5-1f6f01393c41","Type":"ContainerStarted","Data":"5e9d1e0bc1a45a21e8db4ef09416f28370ed53d92887169f42f216cfe54d11cc"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.436346 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e80cc962-9e7a-4fc4-8cce-d090c73d5d62","Type":"ContainerStarted","Data":"4a237f9dfd9a5ee91ac9aaec650ac895dcef3e90b54b35da2880edc5b50a5905"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.452555 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bw8zp" event={"ID":"5db4372d-41b4-4247-97ab-7f27026c2a82","Type":"ContainerStarted","Data":"f5a0bcf57bba2b7b080f1839bce4a9e1453126b83a24288123ea4b72d204ad33"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.464835 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dgswv"] Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.471407 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-dgswv"] Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.496556 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ml9gs" event={"ID":"517ef493-1599-408d-bf6d-0e0eaef4d28c","Type":"ContainerStarted","Data":"0c6b0aa9969d1daef16fbc43fbfa32ee82f269558a3c2949dd00b1a54243bf79"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.518167 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerStarted","Data":"9cdef688446cb91b2db69a04d0920b34a65556fe8a0731cc9e278c9bbe28ceae"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.519841 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bw8zp" podStartSLOduration=3.519826044 podStartE2EDuration="3.519826044s" podCreationTimestamp="2025-12-10 12:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:52.497068255 +0000 UTC m=+1258.582593479" watchObservedRunningTime="2025-12-10 12:12:52.519826044 +0000 UTC m=+1258.605351268" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.525697 4852 generic.go:334] "Generic (PLEG): container finished" podID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerID="f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4" exitCode=0 Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.526763 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.527314 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" event={"ID":"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081","Type":"ContainerDied","Data":"f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.527343 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-4b9mk" event={"ID":"2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081","Type":"ContainerDied","Data":"f26e10a62fe2667dbed96768b1403caaeb17d330fcd15fcff157557f0ea34d13"} Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.527360 4852 scope.go:117] "RemoveContainer" containerID="f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.560030 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-4b9mk"] Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.568931 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-4b9mk"] Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.624841 4852 scope.go:117] "RemoveContainer" containerID="18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.678076 4852 scope.go:117] "RemoveContainer" containerID="f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4" Dec 10 12:12:52 crc kubenswrapper[4852]: E1210 12:12:52.678671 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4\": container with ID starting with f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4 not found: ID does not exist" containerID="f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.678732 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4"} err="failed to get container status \"f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4\": rpc error: code = NotFound desc = could not find container \"f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4\": container with ID starting with f63b22fe81597f7bf3ef3ca039634b1229fb12dde8675bca294471352c4c22d4 not found: ID does not exist" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.678759 4852 scope.go:117] "RemoveContainer" containerID="18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb" Dec 10 12:12:52 crc kubenswrapper[4852]: E1210 12:12:52.679087 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb\": container with ID starting with 18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb not found: ID does not exist" containerID="18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb" Dec 10 12:12:52 crc kubenswrapper[4852]: I1210 12:12:52.679111 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb"} err="failed to get container status 
\"18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb\": rpc error: code = NotFound desc = could not find container \"18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb\": container with ID starting with 18e3eac8be0b09da6d089abbcab174707cfe23eff9086393b691549bae5abaeb not found: ID does not exist" Dec 10 12:12:53 crc kubenswrapper[4852]: I1210 12:12:53.163878 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:12:53 crc kubenswrapper[4852]: W1210 12:12:53.178570 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7444e87_8857_4aee_b9c6_5797c8203b8d.slice/crio-85e54e73ae4ec0f94892f3752230014e8075a9498d87996c20142664d4d5559d WatchSource:0}: Error finding container 85e54e73ae4ec0f94892f3752230014e8075a9498d87996c20142664d4d5559d: Status 404 returned error can't find the container with id 85e54e73ae4ec0f94892f3752230014e8075a9498d87996c20142664d4d5559d Dec 10 12:12:53 crc kubenswrapper[4852]: I1210 12:12:53.542074 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e80cc962-9e7a-4fc4-8cce-d090c73d5d62","Type":"ContainerStarted","Data":"444d212e10a7e6f09cc349c51798832938114b576adb0b464ec0e78b66bd679b"} Dec 10 12:12:53 crc kubenswrapper[4852]: I1210 12:12:53.573600 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7444e87-8857-4aee-b9c6-5797c8203b8d","Type":"ContainerStarted","Data":"85e54e73ae4ec0f94892f3752230014e8075a9498d87996c20142664d4d5559d"} Dec 10 12:12:53 crc kubenswrapper[4852]: I1210 12:12:53.656901 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" event={"ID":"b9ace072-49c8-4747-b7d5-1f6f01393c41","Type":"ContainerStarted","Data":"fbaeecc7a4f86216c08b7503d4123c044867a74eaa0a9b490920f6f46493594e"} Dec 10 12:12:53 crc kubenswrapper[4852]: I1210 12:12:53.687905 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" podStartSLOduration=3.68788645 podStartE2EDuration="3.68788645s" podCreationTimestamp="2025-12-10 12:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:53.68348498 +0000 UTC m=+1259.769010204" watchObservedRunningTime="2025-12-10 12:12:53.68788645 +0000 UTC m=+1259.773411684" Dec 10 12:12:54 crc kubenswrapper[4852]: I1210 12:12:54.202374 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" path="/var/lib/kubelet/pods/2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081/volumes" Dec 10 12:12:54 crc kubenswrapper[4852]: I1210 12:12:54.202936 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" path="/var/lib/kubelet/pods/c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70/volumes" Dec 10 12:12:54 crc kubenswrapper[4852]: I1210 12:12:54.668757 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.125173 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.173346 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d6fc7b549-vj5mb"] Dec 10 
12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.222884 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c94757ccc-jfgtq"] Dec 10 12:12:55 crc kubenswrapper[4852]: E1210 12:12:55.223295 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerName="init" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.223307 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerName="init" Dec 10 12:12:55 crc kubenswrapper[4852]: E1210 12:12:55.223321 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerName="dnsmasq-dns" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.223327 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerName="dnsmasq-dns" Dec 10 12:12:55 crc kubenswrapper[4852]: E1210 12:12:55.223334 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" containerName="init" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.223341 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" containerName="init" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.223494 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a3fd8e9-e2f9-4a5f-a97a-3fcbd8003081" containerName="dnsmasq-dns" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.223515 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4548a8a-1a72-4ca3-a9b8-b3e2f3769e70" containerName="init" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.224444 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.229765 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.251833 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c94757ccc-jfgtq"] Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.279918 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.280940 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-scripts\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.280996 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pn5d\" (UniqueName: \"kubernetes.io/projected/fdd39a7d-3499-4846-82b1-452bd627dd23-kube-api-access-4pn5d\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.281048 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdd39a7d-3499-4846-82b1-452bd627dd23-horizon-secret-key\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.281092 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-config-data\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.281115 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd39a7d-3499-4846-82b1-452bd627dd23-logs\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.382761 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-scripts\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.382881 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pn5d\" (UniqueName: \"kubernetes.io/projected/fdd39a7d-3499-4846-82b1-452bd627dd23-kube-api-access-4pn5d\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.383694 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdd39a7d-3499-4846-82b1-452bd627dd23-horizon-secret-key\") pod 
\"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.383725 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-scripts\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.384459 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-config-data\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.384500 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd39a7d-3499-4846-82b1-452bd627dd23-logs\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.384885 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd39a7d-3499-4846-82b1-452bd627dd23-logs\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.386303 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-config-data\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.401321 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdd39a7d-3499-4846-82b1-452bd627dd23-horizon-secret-key\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.401527 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pn5d\" (UniqueName: \"kubernetes.io/projected/fdd39a7d-3499-4846-82b1-452bd627dd23-kube-api-access-4pn5d\") pod \"horizon-6c94757ccc-jfgtq\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:55 crc kubenswrapper[4852]: I1210 12:12:55.540990 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:12:56 crc kubenswrapper[4852]: W1210 12:12:56.013181 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd39a7d_3499_4846_82b1_452bd627dd23.slice/crio-f2dd917be3d8fa042acf5c634fa82cdbc02cb741c589466be27689098064dd1d WatchSource:0}: Error finding container f2dd917be3d8fa042acf5c634fa82cdbc02cb741c589466be27689098064dd1d: Status 404 returned error can't find the container with id f2dd917be3d8fa042acf5c634fa82cdbc02cb741c589466be27689098064dd1d Dec 10 12:12:56 crc kubenswrapper[4852]: I1210 12:12:56.013732 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c94757ccc-jfgtq"] Dec 10 12:12:56 crc kubenswrapper[4852]: I1210 12:12:56.690386 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c94757ccc-jfgtq" event={"ID":"fdd39a7d-3499-4846-82b1-452bd627dd23","Type":"ContainerStarted","Data":"f2dd917be3d8fa042acf5c634fa82cdbc02cb741c589466be27689098064dd1d"} Dec 10 12:12:56 crc kubenswrapper[4852]: I1210 12:12:56.693137 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e80cc962-9e7a-4fc4-8cce-d090c73d5d62","Type":"ContainerStarted","Data":"e9e6ef3810af831e83bf21a4686a03d3d33b97e0e0a95242e23814debb6d046c"} Dec 10 12:12:57 crc kubenswrapper[4852]: I1210 12:12:57.720184 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-log" containerID="cri-o://444d212e10a7e6f09cc349c51798832938114b576adb0b464ec0e78b66bd679b" gracePeriod=30 Dec 10 12:12:57 crc kubenswrapper[4852]: I1210 12:12:57.720256 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7444e87-8857-4aee-b9c6-5797c8203b8d","Type":"ContainerStarted","Data":"543624c02fa43b3688dd99e6eb961fbd2c189a47c98cebe3b1384eb4d096987c"} Dec 10 12:12:57 crc kubenswrapper[4852]: I1210 12:12:57.720337 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-httpd" containerID="cri-o://e9e6ef3810af831e83bf21a4686a03d3d33b97e0e0a95242e23814debb6d046c" gracePeriod=30 Dec 10 12:12:57 crc kubenswrapper[4852]: I1210 12:12:57.751260 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.751242751 podStartE2EDuration="7.751242751s" podCreationTimestamp="2025-12-10 12:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:57.740527444 +0000 UTC m=+1263.826052668" watchObservedRunningTime="2025-12-10 12:12:57.751242751 +0000 UTC m=+1263.836767975" Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.734300 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7444e87-8857-4aee-b9c6-5797c8203b8d","Type":"ContainerStarted","Data":"9d2c597766444d2c6afca17c504b1f51e3ef2296956e65af43b8625ab0d52f6f"} Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.734446 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" 
containerName="glance-log" containerID="cri-o://543624c02fa43b3688dd99e6eb961fbd2c189a47c98cebe3b1384eb4d096987c" gracePeriod=30 Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.734632 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-httpd" containerID="cri-o://9d2c597766444d2c6afca17c504b1f51e3ef2296956e65af43b8625ab0d52f6f" gracePeriod=30 Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.738605 4852 generic.go:334] "Generic (PLEG): container finished" podID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerID="e9e6ef3810af831e83bf21a4686a03d3d33b97e0e0a95242e23814debb6d046c" exitCode=0 Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.738638 4852 generic.go:334] "Generic (PLEG): container finished" podID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerID="444d212e10a7e6f09cc349c51798832938114b576adb0b464ec0e78b66bd679b" exitCode=143 Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.738650 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e80cc962-9e7a-4fc4-8cce-d090c73d5d62","Type":"ContainerDied","Data":"e9e6ef3810af831e83bf21a4686a03d3d33b97e0e0a95242e23814debb6d046c"} Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.738726 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e80cc962-9e7a-4fc4-8cce-d090c73d5d62","Type":"ContainerDied","Data":"444d212e10a7e6f09cc349c51798832938114b576adb0b464ec0e78b66bd679b"} Dec 10 12:12:58 crc kubenswrapper[4852]: I1210 12:12:58.764134 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.764112408999999 podStartE2EDuration="8.764112409s" podCreationTimestamp="2025-12-10 12:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:12:58.755405172 +0000 UTC m=+1264.840930406" watchObservedRunningTime="2025-12-10 12:12:58.764112409 +0000 UTC m=+1264.849637643" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.076297 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169361 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-config-data\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169468 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-642d8\" (UniqueName: \"kubernetes.io/projected/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-kube-api-access-642d8\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169499 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-httpd-run\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169525 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169596 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-scripts\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169650 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-combined-ca-bundle\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.169715 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-logs\") pod \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\" (UID: \"e80cc962-9e7a-4fc4-8cce-d090c73d5d62\") " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.170296 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-logs" (OuterVolumeSpecName: "logs") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.170603 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.176614 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-kube-api-access-642d8" (OuterVolumeSpecName: "kube-api-access-642d8") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "kube-api-access-642d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.185424 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-scripts" (OuterVolumeSpecName: "scripts") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.186485 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.196423 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.213061 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-config-data" (OuterVolumeSpecName: "config-data") pod "e80cc962-9e7a-4fc4-8cce-d090c73d5d62" (UID: "e80cc962-9e7a-4fc4-8cce-d090c73d5d62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.272884 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.272934 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.272949 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.272961 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-642d8\" (UniqueName: \"kubernetes.io/projected/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-kube-api-access-642d8\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.272973 4852 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.273002 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.273011 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e80cc962-9e7a-4fc4-8cce-d090c73d5d62-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.299251 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.374639 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.752166 4852 generic.go:334] "Generic (PLEG): container finished" podID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerID="543624c02fa43b3688dd99e6eb961fbd2c189a47c98cebe3b1384eb4d096987c" exitCode=143 Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.752268 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7444e87-8857-4aee-b9c6-5797c8203b8d","Type":"ContainerDied","Data":"543624c02fa43b3688dd99e6eb961fbd2c189a47c98cebe3b1384eb4d096987c"} Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.754563 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e80cc962-9e7a-4fc4-8cce-d090c73d5d62","Type":"ContainerDied","Data":"4a237f9dfd9a5ee91ac9aaec650ac895dcef3e90b54b35da2880edc5b50a5905"} Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.754607 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.754615 4852 scope.go:117] "RemoveContainer" containerID="e9e6ef3810af831e83bf21a4686a03d3d33b97e0e0a95242e23814debb6d046c" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.787907 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.795682 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.815499 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:59 crc kubenswrapper[4852]: E1210 12:12:59.815876 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-log" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.815896 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-log" Dec 10 12:12:59 crc kubenswrapper[4852]: E1210 12:12:59.815930 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-httpd" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.815936 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-httpd" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.816107 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-httpd" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.816131 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" containerName="glance-log" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.818605 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.822425 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.822814 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.830208 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886021 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886099 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886318 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-config-data\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886381 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-scripts\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886458 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-logs\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886619 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886745 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.886808 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5jnfv\" (UniqueName: \"kubernetes.io/projected/cef7f8f3-6f82-406b-bce2-e127bcd7546b-kube-api-access-5jnfv\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988300 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-config-data\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988538 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-scripts\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988569 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-logs\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988637 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988687 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988722 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnfv\" (UniqueName: \"kubernetes.io/projected/cef7f8f3-6f82-406b-bce2-e127bcd7546b-kube-api-access-5jnfv\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988767 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.988785 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.989835 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.993697 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:12:59 crc kubenswrapper[4852]: I1210 12:12:59.996441 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-config-data\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.002739 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-logs\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.010533 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.011828 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.052010 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-scripts\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.056921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnfv\" (UniqueName: \"kubernetes.io/projected/cef7f8f3-6f82-406b-bce2-e127bcd7546b-kube-api-access-5jnfv\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.096150 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") " pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.186870 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80cc962-9e7a-4fc4-8cce-d090c73d5d62" path="/var/lib/kubelet/pods/e80cc962-9e7a-4fc4-8cce-d090c73d5d62/volumes" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.204987 4852 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.711145 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.780416 4852 generic.go:334] "Generic (PLEG): container finished" podID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerID="9d2c597766444d2c6afca17c504b1f51e3ef2296956e65af43b8625ab0d52f6f" exitCode=0 Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.780463 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7444e87-8857-4aee-b9c6-5797c8203b8d","Type":"ContainerDied","Data":"9d2c597766444d2c6afca17c504b1f51e3ef2296956e65af43b8625ab0d52f6f"} Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.791770 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-cfm8q"] Dec 10 12:13:00 crc kubenswrapper[4852]: I1210 12:13:00.792027 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" containerID="cri-o://dd6985a6e7fa19d43bfb587d3319fece92772e7d9aa409320314fa5f34730405" gracePeriod=10 Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.295362 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d8d7b9c7-kcmvn"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.334060 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.350348 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-955f9866d-84pn5"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.352386 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.358738 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.366038 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-955f9866d-84pn5"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.410679 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c94757ccc-jfgtq"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424181 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-combined-ca-bundle\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424269 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-scripts\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424372 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmggl\" (UniqueName: \"kubernetes.io/projected/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-kube-api-access-fmggl\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424409 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-secret-key\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424455 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-tls-certs\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424485 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-config-data\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.424514 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-logs\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.455564 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-cc955f7d4-bclr7"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.457408 4852 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.475914 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cc955f7d4-bclr7"] Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.525851 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmggl\" (UniqueName: \"kubernetes.io/projected/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-kube-api-access-fmggl\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.525924 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-secret-key\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.525978 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-tls-certs\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.526009 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-config-data\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.526039 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-logs\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.526096 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-combined-ca-bundle\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.526130 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-scripts\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.527101 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-scripts\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.528743 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-config-data\") pod \"horizon-955f9866d-84pn5\" (UID: 
\"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.528947 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-logs\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.534533 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-secret-key\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.534628 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-tls-certs\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.535166 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-combined-ca-bundle\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.561411 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmggl\" (UniqueName: \"kubernetes.io/projected/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-kube-api-access-fmggl\") pod \"horizon-955f9866d-84pn5\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.628333 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-combined-ca-bundle\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.628647 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-horizon-secret-key\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.628781 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-logs\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.628877 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-horizon-tls-certs\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc 
kubenswrapper[4852]: I1210 12:13:01.629036 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgbn\" (UniqueName: \"kubernetes.io/projected/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-kube-api-access-fdgbn\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.629170 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-scripts\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.629295 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-config-data\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.724648 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730171 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-horizon-secret-key\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730246 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-logs\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730270 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-horizon-tls-certs\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730297 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgbn\" (UniqueName: \"kubernetes.io/projected/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-kube-api-access-fdgbn\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730350 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-scripts\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730375 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-config-data\") pod \"horizon-cc955f7d4-bclr7\" (UID: 
\"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.730397 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-combined-ca-bundle\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.731256 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-scripts\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.731276 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-logs\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.732451 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-config-data\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.735634 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-combined-ca-bundle\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.735839 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-horizon-secret-key\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.736026 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-horizon-tls-certs\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.745195 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgbn\" (UniqueName: \"kubernetes.io/projected/35b770c5-bcea-4f68-8c5b-fb852f8b97a9-kube-api-access-fdgbn\") pod \"horizon-cc955f7d4-bclr7\" (UID: \"35b770c5-bcea-4f68-8c5b-fb852f8b97a9\") " pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.776535 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.796002 4852 generic.go:334] "Generic (PLEG): container finished" podID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerID="dd6985a6e7fa19d43bfb587d3319fece92772e7d9aa409320314fa5f34730405" exitCode=0 Dec 10 12:13:01 crc kubenswrapper[4852]: I1210 12:13:01.796069 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" event={"ID":"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe","Type":"ContainerDied","Data":"dd6985a6e7fa19d43bfb587d3319fece92772e7d9aa409320314fa5f34730405"} Dec 10 12:13:02 crc kubenswrapper[4852]: I1210 12:13:02.804578 4852 generic.go:334] "Generic (PLEG): container finished" podID="113430a5-0056-4448-918d-e77a8feb53fd" containerID="a0bba005ffdf5ec712b59d4c1ea55ae6093e224ae6d4642a76152003f665a9dd" exitCode=0 Dec 10 12:13:02 crc kubenswrapper[4852]: I1210 12:13:02.804673 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qvqxs" event={"ID":"113430a5-0056-4448-918d-e77a8feb53fd","Type":"ContainerDied","Data":"a0bba005ffdf5ec712b59d4c1ea55ae6093e224ae6d4642a76152003f665a9dd"} Dec 10 12:13:03 crc kubenswrapper[4852]: I1210 12:13:03.428948 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Dec 10 12:13:08 crc kubenswrapper[4852]: I1210 12:13:08.428783 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.006361 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.144762 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-logs\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.145114 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.145747 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-logs" (OuterVolumeSpecName: "logs") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.146664 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-combined-ca-bundle\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.146719 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtc2\" (UniqueName: \"kubernetes.io/projected/f7444e87-8857-4aee-b9c6-5797c8203b8d-kube-api-access-zrtc2\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.146759 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-config-data\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.146804 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-httpd-run\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.146831 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-scripts\") pod \"f7444e87-8857-4aee-b9c6-5797c8203b8d\" (UID: \"f7444e87-8857-4aee-b9c6-5797c8203b8d\") " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.147129 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.147909 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.152656 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-scripts" (OuterVolumeSpecName: "scripts") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.154305 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.156151 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7444e87-8857-4aee-b9c6-5797c8203b8d-kube-api-access-zrtc2" (OuterVolumeSpecName: "kube-api-access-zrtc2") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "kube-api-access-zrtc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.179204 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.201474 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-config-data" (OuterVolumeSpecName: "config-data") pod "f7444e87-8857-4aee-b9c6-5797c8203b8d" (UID: "f7444e87-8857-4aee-b9c6-5797c8203b8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.248420 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.248456 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.248477 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtc2\" (UniqueName: \"kubernetes.io/projected/f7444e87-8857-4aee-b9c6-5797c8203b8d-kube-api-access-zrtc2\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.248490 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.248500 4852 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7444e87-8857-4aee-b9c6-5797c8203b8d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.248511 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7444e87-8857-4aee-b9c6-5797c8203b8d-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.273945 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.349524 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.879304 4852 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7444e87-8857-4aee-b9c6-5797c8203b8d","Type":"ContainerDied","Data":"85e54e73ae4ec0f94892f3752230014e8075a9498d87996c20142664d4d5559d"} Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.879429 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.922471 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.929024 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.961658 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:13:11 crc kubenswrapper[4852]: E1210 12:13:11.962179 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-log" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.962199 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-log" Dec 10 12:13:11 crc kubenswrapper[4852]: E1210 12:13:11.962214 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-httpd" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.962222 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-httpd" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.962464 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-httpd" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.962498 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" containerName="glance-log" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.963815 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.970355 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.971481 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 10 12:13:11 crc kubenswrapper[4852]: I1210 12:13:11.983987 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163408 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163465 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163485 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-logs\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163505 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163576 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jhrh\" (UniqueName: \"kubernetes.io/projected/e410ffd3-2d2c-4665-95fd-e20c287c3151-kube-api-access-4jhrh\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163610 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163646 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.163683 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.189636 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7444e87-8857-4aee-b9c6-5797c8203b8d" path="/var/lib/kubelet/pods/f7444e87-8857-4aee-b9c6-5797c8203b8d/volumes" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.265878 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.265948 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.265987 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-logs\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.266016 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.266091 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jhrh\" (UniqueName: \"kubernetes.io/projected/e410ffd3-2d2c-4665-95fd-e20c287c3151-kube-api-access-4jhrh\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.266120 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.266174 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.266228 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " 
pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.267425 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.267689 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-logs\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.271907 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.275407 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.275824 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.279161 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.287658 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.291430 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jhrh\" (UniqueName: \"kubernetes.io/projected/e410ffd3-2d2c-4665-95fd-e20c287c3151-kube-api-access-4jhrh\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.309646 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:13:12 crc kubenswrapper[4852]: I1210 12:13:12.602257 4852 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:13:15 crc kubenswrapper[4852]: E1210 12:13:15.774208 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 10 12:13:15 crc kubenswrapper[4852]: E1210 12:13:15.774698 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n89h5c6h68h77h5h56bh656h684hb5h64bh5ddh5c4h67fh67ch55fh59bh689hdfh659h58fh544h7ch5cdh57fh586h658h5dch579h667h667h697h67bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9cbzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5d8d7b9c7-kcmvn_openstack(85248dbd-6fba-48bf-9c4e-cb37fd6d12ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:13:15 crc kubenswrapper[4852]: E1210 12:13:15.780559 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5d8d7b9c7-kcmvn" podUID="85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" Dec 10 12:13:15 crc kubenswrapper[4852]: I1210 12:13:15.790337 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:13:15 crc kubenswrapper[4852]: I1210 12:13:15.790405 4852 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:13:15 crc kubenswrapper[4852]: I1210 12:13:15.790458 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:13:15 crc kubenswrapper[4852]: I1210 12:13:15.791276 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15e58a6d5758dde8e8be6570ea8629914b8054e6378a86d3d8b1552b7be80d78"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:13:15 crc kubenswrapper[4852]: I1210 12:13:15.791353 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://15e58a6d5758dde8e8be6570ea8629914b8054e6378a86d3d8b1552b7be80d78" gracePeriod=600 Dec 10 12:13:16 crc kubenswrapper[4852]: I1210 12:13:16.920813 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="15e58a6d5758dde8e8be6570ea8629914b8054e6378a86d3d8b1552b7be80d78" exitCode=0 Dec 10 12:13:16 crc kubenswrapper[4852]: I1210 12:13:16.920888 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"15e58a6d5758dde8e8be6570ea8629914b8054e6378a86d3d8b1552b7be80d78"} Dec 10 12:13:17 crc kubenswrapper[4852]: E1210 12:13:17.893218 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 10 12:13:17 crc kubenswrapper[4852]: E1210 12:13:17.893438 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n74h688h5d5h76h5c5hbbh59bh5cfh658h5c6h588h587hfh695h685h96h647h5ddh56ch5cdh5d7hf4h68hc6h5c6h5ch654h65ch5c9h597h6bh5bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2tvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7d6fc7b549-vj5mb_openstack(8397ac00-7478-4849-900c-cd19b2c83305): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:13:17 crc kubenswrapper[4852]: E1210 12:13:17.896890 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7d6fc7b549-vj5mb" podUID="8397ac00-7478-4849-900c-cd19b2c83305" Dec 10 12:13:18 crc kubenswrapper[4852]: I1210 12:13:18.428531 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 10 12:13:18 crc kubenswrapper[4852]: I1210 12:13:18.429145 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:13:23 crc kubenswrapper[4852]: I1210 12:13:23.429748 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 10 12:13:28 crc kubenswrapper[4852]: I1210 12:13:28.431268 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.111:5353: i/o timeout" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.667915 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.789784 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-config-data\") pod \"113430a5-0056-4448-918d-e77a8feb53fd\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.790100 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-credential-keys\") pod \"113430a5-0056-4448-918d-e77a8feb53fd\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.790195 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-fernet-keys\") pod \"113430a5-0056-4448-918d-e77a8feb53fd\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.790326 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-combined-ca-bundle\") pod \"113430a5-0056-4448-918d-e77a8feb53fd\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.790350 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-scripts\") pod \"113430a5-0056-4448-918d-e77a8feb53fd\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.790374 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/113430a5-0056-4448-918d-e77a8feb53fd-kube-api-access-2t6qw\") pod \"113430a5-0056-4448-918d-e77a8feb53fd\" (UID: \"113430a5-0056-4448-918d-e77a8feb53fd\") " Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.796546 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "113430a5-0056-4448-918d-e77a8feb53fd" (UID: "113430a5-0056-4448-918d-e77a8feb53fd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.797170 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-scripts" (OuterVolumeSpecName: "scripts") pod "113430a5-0056-4448-918d-e77a8feb53fd" (UID: "113430a5-0056-4448-918d-e77a8feb53fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.799595 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113430a5-0056-4448-918d-e77a8feb53fd-kube-api-access-2t6qw" (OuterVolumeSpecName: "kube-api-access-2t6qw") pod "113430a5-0056-4448-918d-e77a8feb53fd" (UID: "113430a5-0056-4448-918d-e77a8feb53fd"). 
InnerVolumeSpecName "kube-api-access-2t6qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.799744 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "113430a5-0056-4448-918d-e77a8feb53fd" (UID: "113430a5-0056-4448-918d-e77a8feb53fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.821446 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-config-data" (OuterVolumeSpecName: "config-data") pod "113430a5-0056-4448-918d-e77a8feb53fd" (UID: "113430a5-0056-4448-918d-e77a8feb53fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.822722 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "113430a5-0056-4448-918d-e77a8feb53fd" (UID: "113430a5-0056-4448-918d-e77a8feb53fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.893123 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.893178 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/113430a5-0056-4448-918d-e77a8feb53fd-kube-api-access-2t6qw\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.893194 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.893204 4852 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.893219 4852 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:30 crc kubenswrapper[4852]: I1210 12:13:30.893256 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113430a5-0056-4448-918d-e77a8feb53fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:31 crc kubenswrapper[4852]: I1210 12:13:31.030454 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qvqxs" event={"ID":"113430a5-0056-4448-918d-e77a8feb53fd","Type":"ContainerDied","Data":"fb5b7dbd2a254707bb133faaa114f3429d2d918f6c530e73ee1c33be3e2d9438"} Dec 10 12:13:31 crc kubenswrapper[4852]: I1210 12:13:31.030496 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb5b7dbd2a254707bb133faaa114f3429d2d918f6c530e73ee1c33be3e2d9438" Dec 10 12:13:31 crc 
kubenswrapper[4852]: I1210 12:13:31.030527 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qvqxs" Dec 10 12:13:31 crc kubenswrapper[4852]: E1210 12:13:31.758339 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 10 12:13:31 crc kubenswrapper[4852]: E1210 12:13:31.758532 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wsdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qwzww_openstack(8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:13:31 crc kubenswrapper[4852]: E1210 12:13:31.759820 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qwzww" podUID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" Dec 10 12:13:31 crc kubenswrapper[4852]: I1210 12:13:31.765481 4852 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qvqxs"] Dec 10 12:13:31 crc kubenswrapper[4852]: I1210 12:13:31.773797 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qvqxs"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.843076 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.852081 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.897783 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-l7ltl"] Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:31.898622 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="init" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.898640 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="init" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:31.898655 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113430a5-0056-4448-918d-e77a8feb53fd" containerName="keystone-bootstrap" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.898664 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="113430a5-0056-4448-918d-e77a8feb53fd" containerName="keystone-bootstrap" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:31.898770 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.898781 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.899036 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.899074 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="113430a5-0056-4448-918d-e77a8feb53fd" containerName="keystone-bootstrap" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.900738 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.902650 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs7cl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.902897 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.903371 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.903788 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.904266 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:31.911430 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l7ltl"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.020869 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-logs\") pod \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.020915 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-scripts\") pod \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.020970 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-dns-svc\") pod \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.020992 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-nb\") pod \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021016 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-config-data\") pod \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021037 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-horizon-secret-key\") pod \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021067 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cbzn\" (UniqueName: \"kubernetes.io/projected/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-kube-api-access-9cbzn\") pod \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\" (UID: \"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021089 4852 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-sb\") pod \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021151 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pz6l\" (UniqueName: \"kubernetes.io/projected/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-kube-api-access-6pz6l\") pod \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021172 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-config\") pod \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\" (UID: \"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021316 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-logs" (OuterVolumeSpecName: "logs") pod "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" (UID: "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021426 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-combined-ca-bundle\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021462 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-credential-keys\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021505 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-fernet-keys\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021526 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-config-data\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021570 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-scripts\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021600 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mqqcz\" (UniqueName: \"kubernetes.io/projected/b0bf8197-7342-4289-ae56-f606b479778b-kube-api-access-mqqcz\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.021674 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.022073 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-scripts" (OuterVolumeSpecName: "scripts") pod "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" (UID: "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.022924 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-config-data" (OuterVolumeSpecName: "config-data") pod "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" (UID: "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.024680 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-kube-api-access-9cbzn" (OuterVolumeSpecName: "kube-api-access-9cbzn") pod "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" (UID: "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec"). InnerVolumeSpecName "kube-api-access-9cbzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.026718 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" (UID: "85248dbd-6fba-48bf-9c4e-cb37fd6d12ec"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.028780 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-kube-api-access-6pz6l" (OuterVolumeSpecName: "kube-api-access-6pz6l") pod "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" (UID: "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe"). InnerVolumeSpecName "kube-api-access-6pz6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.044955 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.044963 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" event={"ID":"8e1d381e-04cb-4b4f-bca5-d091b0b10dbe","Type":"ContainerDied","Data":"b5a6fb2931f3e332fcca6ed8a268cd06d53ffa4a8dc9b4953c65facd24051092"} Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.047008 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d8d7b9c7-kcmvn" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.047001 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8d7b9c7-kcmvn" event={"ID":"85248dbd-6fba-48bf-9c4e-cb37fd6d12ec","Type":"ContainerDied","Data":"abc7616b8ea40219032b61025f49592c456240ea74613e2e2610cda1361e84f1"} Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:32.048599 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qwzww" podUID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.078042 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" (UID: "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.080850 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" (UID: "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.087015 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" (UID: "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.105634 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-config" (OuterVolumeSpecName: "config") pod "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" (UID: "8e1d381e-04cb-4b4f-bca5-d091b0b10dbe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123352 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-scripts\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123403 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqqcz\" (UniqueName: \"kubernetes.io/projected/b0bf8197-7342-4289-ae56-f606b479778b-kube-api-access-mqqcz\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123464 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-combined-ca-bundle\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123494 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-credential-keys\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123534 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-fernet-keys\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123555 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-config-data\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123606 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123617 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123625 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123635 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123643 4852 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123652 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cbzn\" (UniqueName: \"kubernetes.io/projected/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec-kube-api-access-9cbzn\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123660 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123668 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pz6l\" (UniqueName: \"kubernetes.io/projected/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-kube-api-access-6pz6l\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.123677 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.126942 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-scripts\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.127104 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-combined-ca-bundle\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.128259 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-config-data\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.128310 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-credential-keys\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.129669 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-fernet-keys\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.141027 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqcz\" (UniqueName: \"kubernetes.io/projected/b0bf8197-7342-4289-ae56-f606b479778b-kube-api-access-mqqcz\") pod \"keystone-bootstrap-l7ltl\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") " pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.192392 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="113430a5-0056-4448-918d-e77a8feb53fd" path="/var/lib/kubelet/pods/113430a5-0056-4448-918d-e77a8feb53fd/volumes" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.226893 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l7ltl" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.255666 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d8d7b9c7-kcmvn"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.261762 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d8d7b9c7-kcmvn"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.368251 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-cfm8q"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.375721 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-cfm8q"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.667396 4852 scope.go:117] "RemoveContainer" containerID="444d212e10a7e6f09cc349c51798832938114b576adb0b464ec0e78b66bd679b" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:32.678442 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:32.678663 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpfpb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mwvfs_openstack(71453d1b-e7a6-44a3-a449-b1c10eb76997): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:32.680368 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mwvfs" podUID="71453d1b-e7a6-44a3-a449-b1c10eb76997" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.686778 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.837488 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2tvf\" (UniqueName: \"kubernetes.io/projected/8397ac00-7478-4849-900c-cd19b2c83305-kube-api-access-x2tvf\") pod \"8397ac00-7478-4849-900c-cd19b2c83305\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.837557 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397ac00-7478-4849-900c-cd19b2c83305-logs\") pod \"8397ac00-7478-4849-900c-cd19b2c83305\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.837657 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-scripts\") pod \"8397ac00-7478-4849-900c-cd19b2c83305\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.837729 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-config-data\") pod \"8397ac00-7478-4849-900c-cd19b2c83305\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.837756 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8397ac00-7478-4849-900c-cd19b2c83305-horizon-secret-key\") pod \"8397ac00-7478-4849-900c-cd19b2c83305\" (UID: \"8397ac00-7478-4849-900c-cd19b2c83305\") " Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.837935 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8397ac00-7478-4849-900c-cd19b2c83305-logs" (OuterVolumeSpecName: "logs") pod "8397ac00-7478-4849-900c-cd19b2c83305" (UID: "8397ac00-7478-4849-900c-cd19b2c83305"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.838320 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8397ac00-7478-4849-900c-cd19b2c83305-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.838577 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-scripts" (OuterVolumeSpecName: "scripts") pod "8397ac00-7478-4849-900c-cd19b2c83305" (UID: "8397ac00-7478-4849-900c-cd19b2c83305"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.838703 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-config-data" (OuterVolumeSpecName: "config-data") pod "8397ac00-7478-4849-900c-cd19b2c83305" (UID: "8397ac00-7478-4849-900c-cd19b2c83305"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.842096 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8397ac00-7478-4849-900c-cd19b2c83305-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8397ac00-7478-4849-900c-cd19b2c83305" (UID: "8397ac00-7478-4849-900c-cd19b2c83305"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.842801 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8397ac00-7478-4849-900c-cd19b2c83305-kube-api-access-x2tvf" (OuterVolumeSpecName: "kube-api-access-x2tvf") pod "8397ac00-7478-4849-900c-cd19b2c83305" (UID: "8397ac00-7478-4849-900c-cd19b2c83305"). InnerVolumeSpecName "kube-api-access-x2tvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.939767 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2tvf\" (UniqueName: \"kubernetes.io/projected/8397ac00-7478-4849-900c-cd19b2c83305-kube-api-access-x2tvf\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.939804 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.939816 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8397ac00-7478-4849-900c-cd19b2c83305-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:32.939828 4852 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8397ac00-7478-4849-900c-cd19b2c83305-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:33.060617 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d6fc7b549-vj5mb" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:33.060919 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d6fc7b549-vj5mb" event={"ID":"8397ac00-7478-4849-900c-cd19b2c83305","Type":"ContainerDied","Data":"12b3494977e5ff2063020f0ac0b8b2234e1760a8b7bf0be753fa3a03396b4106"} Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:33.061839 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mwvfs" podUID="71453d1b-e7a6-44a3-a449-b1c10eb76997" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:33.147254 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d6fc7b549-vj5mb"] Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:33.156800 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d6fc7b549-vj5mb"] Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:33.226814 4852 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8397ac00_7478_4849_900c_cd19b2c83305.slice/crio-12b3494977e5ff2063020f0ac0b8b2234e1760a8b7bf0be753fa3a03396b4106\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8397ac00_7478_4849_900c_cd19b2c83305.slice\": RecentStats: unable to find data in memory cache]" Dec 10 12:13:33 crc kubenswrapper[4852]: I1210 12:13:33.433027 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-cfm8q" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:33.984372 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 10 12:13:33 crc kubenswrapper[4852]: E1210 12:13:33.985083 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5ch546hdbh5d4h7ch65fhbch669hddh55h557h8ch56dh5f7h668h6ch56h8h59dh5cch5ddh55dh5f4h56dh59ch559h64bh696hb7hdchc4h676q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnzd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(02730e24-a11e-4c7b-9470-9290b251bcb9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.048315 4852 scope.go:117] "RemoveContainer" containerID="9d2c597766444d2c6afca17c504b1f51e3ef2296956e65af43b8625ab0d52f6f" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.201990 4852 scope.go:117] "RemoveContainer" containerID="543624c02fa43b3688dd99e6eb961fbd2c189a47c98cebe3b1384eb4d096987c" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.209622 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8397ac00-7478-4849-900c-cd19b2c83305" path="/var/lib/kubelet/pods/8397ac00-7478-4849-900c-cd19b2c83305/volumes" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.210180 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85248dbd-6fba-48bf-9c4e-cb37fd6d12ec" path="/var/lib/kubelet/pods/85248dbd-6fba-48bf-9c4e-cb37fd6d12ec/volumes" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.210636 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1d381e-04cb-4b4f-bca5-d091b0b10dbe" path="/var/lib/kubelet/pods/8e1d381e-04cb-4b4f-bca5-d091b0b10dbe/volumes" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.293276 4852 
scope.go:117] "RemoveContainer" containerID="1526ef670a66096e17d3fb224d460b0768d7f2150066a4bc7f3d701b213bd881" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.335964 4852 scope.go:117] "RemoveContainer" containerID="dd6985a6e7fa19d43bfb587d3319fece92772e7d9aa409320314fa5f34730405" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.362628 4852 scope.go:117] "RemoveContainer" containerID="ee8c936add246460659ca312173b12517adc514e1ffdf72843f884b96cdec1de" Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.623259 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cc955f7d4-bclr7"] Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.631843 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-955f9866d-84pn5"] Dec 10 12:13:34 crc kubenswrapper[4852]: W1210 12:13:34.639417 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35b770c5_bcea_4f68_8c5b_fb852f8b97a9.slice/crio-c496b0608bdf3cdc06b59d034a624056dc5e41d8f5a28d1c3a24f3712193118f WatchSource:0}: Error finding container c496b0608bdf3cdc06b59d034a624056dc5e41d8f5a28d1c3a24f3712193118f: Status 404 returned error can't find the container with id c496b0608bdf3cdc06b59d034a624056dc5e41d8f5a28d1c3a24f3712193118f Dec 10 12:13:34 crc kubenswrapper[4852]: W1210 12:13:34.652302 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2bdb4ea_d2a9_4974_9cc5_54b15afa1952.slice/crio-78b45f3b4b325654e6a45842fe89a4f984dfc6f4e00ff0a217c80846b57097b6 WatchSource:0}: Error finding container 78b45f3b4b325654e6a45842fe89a4f984dfc6f4e00ff0a217c80846b57097b6: Status 404 returned error can't find the container with id 78b45f3b4b325654e6a45842fe89a4f984dfc6f4e00ff0a217c80846b57097b6 Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.760311 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.783016 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l7ltl"] Dec 10 12:13:34 crc kubenswrapper[4852]: I1210 12:13:34.869656 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.094009 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c94757ccc-jfgtq" event={"ID":"fdd39a7d-3499-4846-82b1-452bd627dd23","Type":"ContainerStarted","Data":"b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.094066 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c94757ccc-jfgtq" event={"ID":"fdd39a7d-3499-4846-82b1-452bd627dd23","Type":"ContainerStarted","Data":"947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.094099 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c94757ccc-jfgtq" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon-log" containerID="cri-o://947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac" gracePeriod=30 Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.094153 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c94757ccc-jfgtq" 
podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon" containerID="cri-o://b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b" gracePeriod=30 Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.104298 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7ltl" event={"ID":"b0bf8197-7342-4289-ae56-f606b479778b","Type":"ContainerStarted","Data":"a12d1dad0892d1ad4f6e43f9da80378ad4e8833ac49b4deb3ce12933e4e5fd74"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.111606 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef7f8f3-6f82-406b-bce2-e127bcd7546b","Type":"ContainerStarted","Data":"8f8e3b4153a14f2604898087bf99b35152e1130b9a391ee6618143809891df52"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.117686 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-955f9866d-84pn5" event={"ID":"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952","Type":"ContainerStarted","Data":"78b45f3b4b325654e6a45842fe89a4f984dfc6f4e00ff0a217c80846b57097b6"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.118496 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c94757ccc-jfgtq" podStartSLOduration=2.129176858 podStartE2EDuration="40.118451485s" podCreationTimestamp="2025-12-10 12:12:55 +0000 UTC" firstStartedPulling="2025-12-10 12:12:56.01551317 +0000 UTC m=+1262.101038394" lastFinishedPulling="2025-12-10 12:13:34.004787797 +0000 UTC m=+1300.090313021" observedRunningTime="2025-12-10 12:13:35.11663863 +0000 UTC m=+1301.202163854" watchObservedRunningTime="2025-12-10 12:13:35.118451485 +0000 UTC m=+1301.203976709" Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.120405 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e410ffd3-2d2c-4665-95fd-e20c287c3151","Type":"ContainerStarted","Data":"cf1df9f9de452532550e75135f2ccbf612cf630a1fabd93430ab62d4aa53cd78"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.126499 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"3c62f223218c8d67bf458bba29b25f48f874ad6d23f1af6c44094e9bc123c137"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.129958 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc955f7d4-bclr7" event={"ID":"35b770c5-bcea-4f68-8c5b-fb852f8b97a9","Type":"ContainerStarted","Data":"c496b0608bdf3cdc06b59d034a624056dc5e41d8f5a28d1c3a24f3712193118f"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.138627 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ml9gs" event={"ID":"517ef493-1599-408d-bf6d-0e0eaef4d28c","Type":"ContainerStarted","Data":"1118f5b103cf03423c8ff6da9869b2aff3c5c22afe380c5ff63466b3dd5aaedd"} Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.164621 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ml9gs" podStartSLOduration=4.604295727 podStartE2EDuration="45.164606388s" podCreationTimestamp="2025-12-10 12:12:50 +0000 UTC" firstStartedPulling="2025-12-10 12:12:52.092364203 +0000 UTC m=+1258.177889427" lastFinishedPulling="2025-12-10 12:13:32.652674864 +0000 UTC m=+1298.738200088" observedRunningTime="2025-12-10 12:13:35.158883575 +0000 UTC m=+1301.244408809" 
watchObservedRunningTime="2025-12-10 12:13:35.164606388 +0000 UTC m=+1301.250131612" Dec 10 12:13:35 crc kubenswrapper[4852]: I1210 12:13:35.542311 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:13:36 crc kubenswrapper[4852]: I1210 12:13:36.156803 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e410ffd3-2d2c-4665-95fd-e20c287c3151","Type":"ContainerStarted","Data":"0fc8af81b80689c6cd891634190a7237468e2d4b4c5a654a4d08da0af40a9c8b"} Dec 10 12:13:36 crc kubenswrapper[4852]: I1210 12:13:36.158445 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7ltl" event={"ID":"b0bf8197-7342-4289-ae56-f606b479778b","Type":"ContainerStarted","Data":"cd1eb2d39a5971f12ea73ff19a309c97ebb40e08d5efb16be5f6e6223ce830db"} Dec 10 12:13:36 crc kubenswrapper[4852]: I1210 12:13:36.203356 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef7f8f3-6f82-406b-bce2-e127bcd7546b","Type":"ContainerStarted","Data":"8296d4bfca3c0f45af44447d3855b307baf52ccaeced35113849b844ba1722d6"} Dec 10 12:13:36 crc kubenswrapper[4852]: I1210 12:13:36.203576 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-955f9866d-84pn5" event={"ID":"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952","Type":"ContainerStarted","Data":"cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f"} Dec 10 12:13:36 crc kubenswrapper[4852]: I1210 12:13:36.225430 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc955f7d4-bclr7" event={"ID":"35b770c5-bcea-4f68-8c5b-fb852f8b97a9","Type":"ContainerStarted","Data":"dd8b6bdfed1f86ccbb51167fef0e352f506fbb3be92711f0108561121cd2db54"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.232144 4852 generic.go:334] "Generic (PLEG): container finished" podID="5db4372d-41b4-4247-97ab-7f27026c2a82" containerID="f5a0bcf57bba2b7b080f1839bce4a9e1453126b83a24288123ea4b72d204ad33" exitCode=0 Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.232236 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bw8zp" event={"ID":"5db4372d-41b4-4247-97ab-7f27026c2a82","Type":"ContainerDied","Data":"f5a0bcf57bba2b7b080f1839bce4a9e1453126b83a24288123ea4b72d204ad33"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.235339 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e410ffd3-2d2c-4665-95fd-e20c287c3151","Type":"ContainerStarted","Data":"193f80cb3a9f9f3ddc44ffdd85d5ec927cc733119a9e5aca24aa455929ade979"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.238178 4852 generic.go:334] "Generic (PLEG): container finished" podID="517ef493-1599-408d-bf6d-0e0eaef4d28c" containerID="1118f5b103cf03423c8ff6da9869b2aff3c5c22afe380c5ff63466b3dd5aaedd" exitCode=0 Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.238243 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ml9gs" event={"ID":"517ef493-1599-408d-bf6d-0e0eaef4d28c","Type":"ContainerDied","Data":"1118f5b103cf03423c8ff6da9869b2aff3c5c22afe380c5ff63466b3dd5aaedd"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.239533 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerStarted","Data":"5b3c265eac126455f9541d52bf3d5da25b3b2ffcd55c59ecbfae2633f19d7cee"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.240969 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef7f8f3-6f82-406b-bce2-e127bcd7546b","Type":"ContainerStarted","Data":"71e76799c782cd80bc5010505abffa377286f23ce9fd5e171fe455c5f5ce5ec0"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.241094 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-log" containerID="cri-o://8296d4bfca3c0f45af44447d3855b307baf52ccaeced35113849b844ba1722d6" gracePeriod=30 Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.241423 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-httpd" containerID="cri-o://71e76799c782cd80bc5010505abffa377286f23ce9fd5e171fe455c5f5ce5ec0" gracePeriod=30 Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.257126 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-l7ltl" podStartSLOduration=6.257102214 podStartE2EDuration="6.257102214s" podCreationTimestamp="2025-12-10 12:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:36.192845541 +0000 UTC m=+1302.278370765" watchObservedRunningTime="2025-12-10 12:13:37.257102214 +0000 UTC m=+1303.342627458" Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.259863 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-955f9866d-84pn5" event={"ID":"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952","Type":"ContainerStarted","Data":"7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.264113 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc955f7d4-bclr7" event={"ID":"35b770c5-bcea-4f68-8c5b-fb852f8b97a9","Type":"ContainerStarted","Data":"1eab3ce93a6444be0221f504896bd9955324166ddcff16bc5ba34c3c533ea21f"} Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.301516 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=38.301495613 podStartE2EDuration="38.301495613s" podCreationTimestamp="2025-12-10 12:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:37.299926334 +0000 UTC m=+1303.385451568" watchObservedRunningTime="2025-12-10 12:13:37.301495613 +0000 UTC m=+1303.387020847" Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.319256 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=26.319219336 podStartE2EDuration="26.319219336s" podCreationTimestamp="2025-12-10 12:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:37.277528664 +0000 UTC m=+1303.363053898" watchObservedRunningTime="2025-12-10 12:13:37.319219336 +0000 UTC m=+1303.404744560" Dec 10 12:13:37 crc 
kubenswrapper[4852]: I1210 12:13:37.348269 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-cc955f7d4-bclr7" podStartSLOduration=36.34819957 podStartE2EDuration="36.34819957s" podCreationTimestamp="2025-12-10 12:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:37.335489962 +0000 UTC m=+1303.421015206" watchObservedRunningTime="2025-12-10 12:13:37.34819957 +0000 UTC m=+1303.433724794" Dec 10 12:13:37 crc kubenswrapper[4852]: I1210 12:13:37.393767 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-955f9866d-84pn5" podStartSLOduration=36.393741238 podStartE2EDuration="36.393741238s" podCreationTimestamp="2025-12-10 12:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:37.361525143 +0000 UTC m=+1303.447050387" watchObservedRunningTime="2025-12-10 12:13:37.393741238 +0000 UTC m=+1303.479266472" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.277696 4852 generic.go:334] "Generic (PLEG): container finished" podID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerID="71e76799c782cd80bc5010505abffa377286f23ce9fd5e171fe455c5f5ce5ec0" exitCode=0 Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.277968 4852 generic.go:334] "Generic (PLEG): container finished" podID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerID="8296d4bfca3c0f45af44447d3855b307baf52ccaeced35113849b844ba1722d6" exitCode=143 Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.277777 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef7f8f3-6f82-406b-bce2-e127bcd7546b","Type":"ContainerDied","Data":"71e76799c782cd80bc5010505abffa377286f23ce9fd5e171fe455c5f5ce5ec0"} Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.278102 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef7f8f3-6f82-406b-bce2-e127bcd7546b","Type":"ContainerDied","Data":"8296d4bfca3c0f45af44447d3855b307baf52ccaeced35113849b844ba1722d6"} Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.709257 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.712055 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ml9gs" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882416 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-config\") pod \"5db4372d-41b4-4247-97ab-7f27026c2a82\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882536 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-combined-ca-bundle\") pod \"5db4372d-41b4-4247-97ab-7f27026c2a82\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882609 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-scripts\") pod \"517ef493-1599-408d-bf6d-0e0eaef4d28c\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882636 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-config-data\") pod \"517ef493-1599-408d-bf6d-0e0eaef4d28c\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882662 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-combined-ca-bundle\") pod \"517ef493-1599-408d-bf6d-0e0eaef4d28c\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882690 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56x6w\" (UniqueName: \"kubernetes.io/projected/5db4372d-41b4-4247-97ab-7f27026c2a82-kube-api-access-56x6w\") pod \"5db4372d-41b4-4247-97ab-7f27026c2a82\" (UID: \"5db4372d-41b4-4247-97ab-7f27026c2a82\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882706 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/517ef493-1599-408d-bf6d-0e0eaef4d28c-logs\") pod \"517ef493-1599-408d-bf6d-0e0eaef4d28c\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.882741 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pt6x\" (UniqueName: \"kubernetes.io/projected/517ef493-1599-408d-bf6d-0e0eaef4d28c-kube-api-access-9pt6x\") pod \"517ef493-1599-408d-bf6d-0e0eaef4d28c\" (UID: \"517ef493-1599-408d-bf6d-0e0eaef4d28c\") " Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.883373 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517ef493-1599-408d-bf6d-0e0eaef4d28c-logs" (OuterVolumeSpecName: "logs") pod "517ef493-1599-408d-bf6d-0e0eaef4d28c" (UID: "517ef493-1599-408d-bf6d-0e0eaef4d28c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.889275 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-scripts" (OuterVolumeSpecName: "scripts") pod "517ef493-1599-408d-bf6d-0e0eaef4d28c" (UID: "517ef493-1599-408d-bf6d-0e0eaef4d28c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.902724 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db4372d-41b4-4247-97ab-7f27026c2a82-kube-api-access-56x6w" (OuterVolumeSpecName: "kube-api-access-56x6w") pod "5db4372d-41b4-4247-97ab-7f27026c2a82" (UID: "5db4372d-41b4-4247-97ab-7f27026c2a82"). InnerVolumeSpecName "kube-api-access-56x6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.909457 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517ef493-1599-408d-bf6d-0e0eaef4d28c-kube-api-access-9pt6x" (OuterVolumeSpecName: "kube-api-access-9pt6x") pod "517ef493-1599-408d-bf6d-0e0eaef4d28c" (UID: "517ef493-1599-408d-bf6d-0e0eaef4d28c"). InnerVolumeSpecName "kube-api-access-9pt6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.917788 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-config-data" (OuterVolumeSpecName: "config-data") pod "517ef493-1599-408d-bf6d-0e0eaef4d28c" (UID: "517ef493-1599-408d-bf6d-0e0eaef4d28c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.919667 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-config" (OuterVolumeSpecName: "config") pod "5db4372d-41b4-4247-97ab-7f27026c2a82" (UID: "5db4372d-41b4-4247-97ab-7f27026c2a82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.920094 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5db4372d-41b4-4247-97ab-7f27026c2a82" (UID: "5db4372d-41b4-4247-97ab-7f27026c2a82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.920583 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "517ef493-1599-408d-bf6d-0e0eaef4d28c" (UID: "517ef493-1599-408d-bf6d-0e0eaef4d28c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.985981 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986043 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986054 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986433 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ef493-1599-408d-bf6d-0e0eaef4d28c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986454 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56x6w\" (UniqueName: \"kubernetes.io/projected/5db4372d-41b4-4247-97ab-7f27026c2a82-kube-api-access-56x6w\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986467 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/517ef493-1599-408d-bf6d-0e0eaef4d28c-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986478 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pt6x\" (UniqueName: \"kubernetes.io/projected/517ef493-1599-408d-bf6d-0e0eaef4d28c-kube-api-access-9pt6x\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:38 crc kubenswrapper[4852]: I1210 12:13:38.986490 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db4372d-41b4-4247-97ab-7f27026c2a82-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.306685 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ml9gs" event={"ID":"517ef493-1599-408d-bf6d-0e0eaef4d28c","Type":"ContainerDied","Data":"0c6b0aa9969d1daef16fbc43fbfa32ee82f269558a3c2949dd00b1a54243bf79"} Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.307023 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c6b0aa9969d1daef16fbc43fbfa32ee82f269558a3c2949dd00b1a54243bf79" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.307095 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ml9gs" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.314696 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bw8zp" event={"ID":"5db4372d-41b4-4247-97ab-7f27026c2a82","Type":"ContainerDied","Data":"8ad4c344ae7d8bb5aec287e20dcc5e5bf5807ec74c444cfd4d5aad94210ad599"} Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.314738 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ad4c344ae7d8bb5aec287e20dcc5e5bf5807ec74c444cfd4d5aad94210ad599" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.314822 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bw8zp" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.559270 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db947f9b4-m6rgq"] Dec 10 12:13:39 crc kubenswrapper[4852]: E1210 12:13:39.560640 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db4372d-41b4-4247-97ab-7f27026c2a82" containerName="neutron-db-sync" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.560667 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db4372d-41b4-4247-97ab-7f27026c2a82" containerName="neutron-db-sync" Dec 10 12:13:39 crc kubenswrapper[4852]: E1210 12:13:39.560683 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517ef493-1599-408d-bf6d-0e0eaef4d28c" containerName="placement-db-sync" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.560691 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="517ef493-1599-408d-bf6d-0e0eaef4d28c" containerName="placement-db-sync" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.561428 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db4372d-41b4-4247-97ab-7f27026c2a82" containerName="neutron-db-sync" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.561478 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="517ef493-1599-408d-bf6d-0e0eaef4d28c" containerName="placement-db-sync" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.563943 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.567066 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.567428 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.567577 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.567688 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xsg7h" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.574195 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.581211 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db947f9b4-m6rgq"] Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.669384 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-68cl7"] Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.683907 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.710028 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ebd8c65-e675-462a-bdba-db5d0ea01754-logs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.710485 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-public-tls-certs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.710589 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-scripts\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.710689 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-combined-ca-bundle\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.710815 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-config-data\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.711004 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-internal-tls-certs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.711170 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrc9\" (UniqueName: \"kubernetes.io/projected/1ebd8c65-e675-462a-bdba-db5d0ea01754-kube-api-access-mmrc9\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.714649 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-68cl7"] Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814532 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814586 4852 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-public-tls-certs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814620 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-config\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814655 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-scripts\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814698 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-combined-ca-bundle\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814724 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-config-data\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814768 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814805 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q95v\" (UniqueName: \"kubernetes.io/projected/2bb0fcab-02ae-40d7-acbf-8976c76d312a-kube-api-access-7q95v\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814835 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-internal-tls-certs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814891 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrc9\" (UniqueName: \"kubernetes.io/projected/1ebd8c65-e675-462a-bdba-db5d0ea01754-kube-api-access-mmrc9\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814918 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814947 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ebd8c65-e675-462a-bdba-db5d0ea01754-logs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.814969 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.822789 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ebd8c65-e675-462a-bdba-db5d0ea01754-logs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.832009 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-config-data\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.838976 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-combined-ca-bundle\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.849677 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-scripts\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.850711 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-public-tls-certs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.857270 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebd8c65-e675-462a-bdba-db5d0ea01754-internal-tls-certs\") pod \"placement-db947f9b4-m6rgq\" (UID: \"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.871279 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrc9\" (UniqueName: \"kubernetes.io/projected/1ebd8c65-e675-462a-bdba-db5d0ea01754-kube-api-access-mmrc9\") pod \"placement-db947f9b4-m6rgq\" (UID: 
\"1ebd8c65-e675-462a-bdba-db5d0ea01754\") " pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.899370 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.916164 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.916276 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.916309 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-config\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.916401 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.916444 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q95v\" (UniqueName: \"kubernetes.io/projected/2bb0fcab-02ae-40d7-acbf-8976c76d312a-kube-api-access-7q95v\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.916523 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.917461 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.918114 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.918846 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.919071 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.923431 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-config\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:39 crc kubenswrapper[4852]: I1210 12:13:39.954185 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q95v\" (UniqueName: \"kubernetes.io/projected/2bb0fcab-02ae-40d7-acbf-8976c76d312a-kube-api-access-7q95v\") pod \"dnsmasq-dns-6b7b667979-68cl7\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.023017 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b4f948f96-w6dtl"] Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.024847 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.034964 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.035317 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wxml9" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.035482 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.035570 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.038638 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.043424 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4f948f96-w6dtl"] Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.121370 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-ovndb-tls-certs\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.121437 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-combined-ca-bundle\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.121458 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-httpd-config\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.121517 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-config\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.121535 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp86j\" (UniqueName: \"kubernetes.io/projected/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-kube-api-access-hp86j\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.226305 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp86j\" (UniqueName: \"kubernetes.io/projected/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-kube-api-access-hp86j\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.226346 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-config\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.226442 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-ovndb-tls-certs\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.226485 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-combined-ca-bundle\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.226504 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-httpd-config\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.230019 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-ovndb-tls-certs\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.230628 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-combined-ca-bundle\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.246829 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-httpd-config\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.253666 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-config\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.254000 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp86j\" (UniqueName: \"kubernetes.io/projected/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-kube-api-access-hp86j\") pod \"neutron-b4f948f96-w6dtl\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:40 crc kubenswrapper[4852]: I1210 12:13:40.369145 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:13:41 crc kubenswrapper[4852]: I1210 12:13:41.725516 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:41 crc kubenswrapper[4852]: I1210 12:13:41.725888 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:13:41 crc kubenswrapper[4852]: I1210 12:13:41.778060 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:41 crc kubenswrapper[4852]: I1210 12:13:41.778102 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.142051 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9dd466c4f-pgb9f"] Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.144191 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9dd466c4f-pgb9f" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.146742 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.154773 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.163742 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9dd466c4f-pgb9f"] Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.271635 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-config\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.271717 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wkln\" (UniqueName: \"kubernetes.io/projected/63687f02-3cc2-4640-88f1-e312bbe550e7-kube-api-access-5wkln\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.271763 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-internal-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.271802 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-ovndb-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f" Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.271891 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-combined-ca-bundle\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " 
pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.271992 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-httpd-config\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.272042 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-public-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.370956 4852 generic.go:334] "Generic (PLEG): container finished" podID="b0bf8197-7342-4289-ae56-f606b479778b" containerID="cd1eb2d39a5971f12ea73ff19a309c97ebb40e08d5efb16be5f6e6223ce830db" exitCode=0
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.370997 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7ltl" event={"ID":"b0bf8197-7342-4289-ae56-f606b479778b","Type":"ContainerDied","Data":"cd1eb2d39a5971f12ea73ff19a309c97ebb40e08d5efb16be5f6e6223ce830db"}
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373647 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wkln\" (UniqueName: \"kubernetes.io/projected/63687f02-3cc2-4640-88f1-e312bbe550e7-kube-api-access-5wkln\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373682 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-internal-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373716 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-ovndb-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373738 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-combined-ca-bundle\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373807 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-httpd-config\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373830 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-public-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.373859 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-config\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.380282 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-config\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.380909 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-internal-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.383071 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-combined-ca-bundle\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.384659 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-ovndb-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.397356 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-httpd-config\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.400056 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wkln\" (UniqueName: \"kubernetes.io/projected/63687f02-3cc2-4640-88f1-e312bbe550e7-kube-api-access-5wkln\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.408371 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63687f02-3cc2-4640-88f1-e312bbe550e7-public-tls-certs\") pod \"neutron-9dd466c4f-pgb9f\" (UID: \"63687f02-3cc2-4640-88f1-e312bbe550e7\") " pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.466648 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.602692 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.602736 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.603103 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.603129 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.653043 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:42 crc kubenswrapper[4852]: I1210 12:13:42.658876 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.628747 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.741031 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnfv\" (UniqueName: \"kubernetes.io/projected/cef7f8f3-6f82-406b-bce2-e127bcd7546b-kube-api-access-5jnfv\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742511 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-config-data\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742636 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-combined-ca-bundle\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742685 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-public-tls-certs\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742716 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-scripts\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742754 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742777 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-httpd-run\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.742798 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-logs\") pod \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\" (UID: \"cef7f8f3-6f82-406b-bce2-e127bcd7546b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.743873 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-logs" (OuterVolumeSpecName: "logs") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.743983 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.751580 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef7f8f3-6f82-406b-bce2-e127bcd7546b-kube-api-access-5jnfv" (OuterVolumeSpecName: "kube-api-access-5jnfv") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "kube-api-access-5jnfv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.755480 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-scripts" (OuterVolumeSpecName: "scripts") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.762473 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.845145 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.845240 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.845258 4852 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.845271 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef7f8f3-6f82-406b-bce2-e127bcd7546b-logs\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.845284 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jnfv\" (UniqueName: \"kubernetes.io/projected/cef7f8f3-6f82-406b-bce2-e127bcd7546b-kube-api-access-5jnfv\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.864891 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.866551 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.882581 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-config-data" (OuterVolumeSpecName: "config-data") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.893259 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cef7f8f3-6f82-406b-bce2-e127bcd7546b" (UID: "cef7f8f3-6f82-406b-bce2-e127bcd7546b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.899010 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l7ltl"
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.946427 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-credential-keys\") pod \"b0bf8197-7342-4289-ae56-f606b479778b\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.946547 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-config-data\") pod \"b0bf8197-7342-4289-ae56-f606b479778b\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.946596 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqqcz\" (UniqueName: \"kubernetes.io/projected/b0bf8197-7342-4289-ae56-f606b479778b-kube-api-access-mqqcz\") pod \"b0bf8197-7342-4289-ae56-f606b479778b\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.946619 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-combined-ca-bundle\") pod \"b0bf8197-7342-4289-ae56-f606b479778b\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.946684 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-fernet-keys\") pod \"b0bf8197-7342-4289-ae56-f606b479778b\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.946756 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-scripts\") pod \"b0bf8197-7342-4289-ae56-f606b479778b\" (UID: \"b0bf8197-7342-4289-ae56-f606b479778b\") "
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.947773 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.947802 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.947815 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.947826 4852 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef7f8f3-6f82-406b-bce2-e127bcd7546b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.969692 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-scripts" (OuterVolumeSpecName: "scripts") pod "b0bf8197-7342-4289-ae56-f606b479778b" (UID: "b0bf8197-7342-4289-ae56-f606b479778b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.974850 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bf8197-7342-4289-ae56-f606b479778b-kube-api-access-mqqcz" (OuterVolumeSpecName: "kube-api-access-mqqcz") pod "b0bf8197-7342-4289-ae56-f606b479778b" (UID: "b0bf8197-7342-4289-ae56-f606b479778b"). InnerVolumeSpecName "kube-api-access-mqqcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.975027 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0bf8197-7342-4289-ae56-f606b479778b" (UID: "b0bf8197-7342-4289-ae56-f606b479778b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:45 crc kubenswrapper[4852]: I1210 12:13:45.982369 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0bf8197-7342-4289-ae56-f606b479778b" (UID: "b0bf8197-7342-4289-ae56-f606b479778b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.049569 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqqcz\" (UniqueName: \"kubernetes.io/projected/b0bf8197-7342-4289-ae56-f606b479778b-kube-api-access-mqqcz\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.049629 4852 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.049645 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.049657 4852 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.056160 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0bf8197-7342-4289-ae56-f606b479778b" (UID: "b0bf8197-7342-4289-ae56-f606b479778b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.057099 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-config-data" (OuterVolumeSpecName: "config-data") pod "b0bf8197-7342-4289-ae56-f606b479778b" (UID: "b0bf8197-7342-4289-ae56-f606b479778b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.150920 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.151141 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bf8197-7342-4289-ae56-f606b479778b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.213769 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.213852 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.284740 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db947f9b4-m6rgq"]
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.435414 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7ltl" event={"ID":"b0bf8197-7342-4289-ae56-f606b479778b","Type":"ContainerDied","Data":"a12d1dad0892d1ad4f6e43f9da80378ad4e8833ac49b4deb3ce12933e4e5fd74"}
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.435458 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a12d1dad0892d1ad4f6e43f9da80378ad4e8833ac49b4deb3ce12933e4e5fd74"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.435536 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l7ltl"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.464526 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerStarted","Data":"f58c4ff28c6889725dde23eb4f974a70d19c67989fea7548a02e74df19d045f2"}
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.465317 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.473875 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.474531 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef7f8f3-6f82-406b-bce2-e127bcd7546b","Type":"ContainerDied","Data":"8f8e3b4153a14f2604898087bf99b35152e1130b9a391ee6618143809891df52"}
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.474585 4852 scope.go:117] "RemoveContainer" containerID="71e76799c782cd80bc5010505abffa377286f23ce9fd5e171fe455c5f5ce5ec0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.480560 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db947f9b4-m6rgq" event={"ID":"1ebd8c65-e675-462a-bdba-db5d0ea01754","Type":"ContainerStarted","Data":"60c16ac762b9d8eb4ebf99f75c724b8c8194fd688dd028c859e08fab060dc2ae"}
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.489943 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4f948f96-w6dtl"]
Dec 10 12:13:46 crc kubenswrapper[4852]: W1210 12:13:46.490363 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1348b87e_f303_4f39_9cf9_aa55ca6b0fd4.slice/crio-8be44bdf0542fe2fc1928a7f9def21427ba9e09d95adf4f6dfcf573522c5dabc WatchSource:0}: Error finding container 8be44bdf0542fe2fc1928a7f9def21427ba9e09d95adf4f6dfcf573522c5dabc: Status 404 returned error can't find the container with id 8be44bdf0542fe2fc1928a7f9def21427ba9e09d95adf4f6dfcf573522c5dabc
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.504796 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-68cl7"]
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.543784 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.566723 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.582220 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 10 12:13:46 crc kubenswrapper[4852]: E1210 12:13:46.583248 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-httpd"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.583267 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-httpd"
Dec 10 12:13:46 crc kubenswrapper[4852]: E1210 12:13:46.583289 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-log"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.583294 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-log"
Dec 10 12:13:46 crc kubenswrapper[4852]: E1210 12:13:46.583306 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bf8197-7342-4289-ae56-f606b479778b" containerName="keystone-bootstrap"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.583312 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bf8197-7342-4289-ae56-f606b479778b" containerName="keystone-bootstrap"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.583484 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-log"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.583499 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" containerName="glance-httpd"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.583511 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0bf8197-7342-4289-ae56-f606b479778b" containerName="keystone-bootstrap"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.585104 4852 scope.go:117] "RemoveContainer" containerID="8296d4bfca3c0f45af44447d3855b307baf52ccaeced35113849b844ba1722d6"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.587470 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.594905 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.596836 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.596924 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674278 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-logs\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674410 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674457 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674526 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-config-data\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674541 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-scripts\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674573 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674597 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpj5\" (UniqueName: \"kubernetes.io/projected/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-kube-api-access-bjpj5\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.674702 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.696447 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9dd466c4f-pgb9f"]
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783361 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-logs\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783458 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783495 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783542 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-config-data\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783568 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-scripts\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783597 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783628 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpj5\" (UniqueName: \"kubernetes.io/projected/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-kube-api-access-bjpj5\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.783698 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.786054 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.794928 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.795382 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-logs\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.798162 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.804656 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-scripts\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.819114 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.825719 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-config-data\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.833836 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpj5\" (UniqueName: \"kubernetes.io/projected/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-kube-api-access-bjpj5\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.843266 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " pod="openstack/glance-default-external-api-0"
Dec 10 12:13:46 crc kubenswrapper[4852]: I1210 12:13:46.997612 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.069495 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7dd8c6757f-lbdxp"]
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.070592 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.078468 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.078774 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.078948 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.079481 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs7cl"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.079765 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.082866 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.089709 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9st\" (UniqueName: \"kubernetes.io/projected/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-kube-api-access-md9st\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.089760 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-fernet-keys\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.089803 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-public-tls-certs\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.092002 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-scripts\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.092067 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-combined-ca-bundle\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.092113 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-config-data\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.092139 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-credential-keys\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.092225 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-internal-tls-certs\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.110924 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7dd8c6757f-lbdxp"]
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.193921 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-scripts\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194182 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-combined-ca-bundle\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194215 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-config-data\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194255 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-credential-keys\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194332 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-internal-tls-certs\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194414 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md9st\" (UniqueName: \"kubernetes.io/projected/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-kube-api-access-md9st\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194444 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-fernet-keys\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.194472 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-public-tls-certs\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.201305 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-config-data\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.201663 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-credential-keys\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.202601 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-combined-ca-bundle\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.205123 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-public-tls-certs\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.205903 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-fernet-keys\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.212753 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-scripts\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.213021 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-internal-tls-certs\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.229721 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9st\" (UniqueName: \"kubernetes.io/projected/fbe4801a-4ecc-4ecd-b00d-da9917481e2e-kube-api-access-md9st\") pod \"keystone-7dd8c6757f-lbdxp\" (UID: \"fbe4801a-4ecc-4ecd-b00d-da9917481e2e\") " pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.418540 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.506842 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mwvfs" event={"ID":"71453d1b-e7a6-44a3-a449-b1c10eb76997","Type":"ContainerStarted","Data":"fd502ff720569629780faae2475f0059b48c63ab992be0e2b745816d33a1238f"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.521746 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qwzww" event={"ID":"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c","Type":"ContainerStarted","Data":"22ecf46a2bcdff4b0e33d8ff9c3cf00f444cde102496f3e4cc42c286b898cac4"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.530132 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mwvfs" podStartSLOduration=4.6486752110000005 podStartE2EDuration="58.530116975s" podCreationTimestamp="2025-12-10 12:12:49 +0000 UTC" firstStartedPulling="2025-12-10 12:12:51.965578395 +0000 UTC m=+1258.051103619" lastFinishedPulling="2025-12-10 12:13:45.847020159 +0000 UTC m=+1311.932545383" observedRunningTime="2025-12-10 12:13:47.527220623 +0000 UTC m=+1313.612745847" watchObservedRunningTime="2025-12-10 12:13:47.530116975 +0000 UTC m=+1313.615642199"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.535351 4852 generic.go:334] "Generic (PLEG): container finished" podID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerID="7208f430134cd22178b78f5ff83b54b2784c642f6277cb9dd9ec6b4375e67d46" exitCode=0
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.535435 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" event={"ID":"2bb0fcab-02ae-40d7-acbf-8976c76d312a","Type":"ContainerDied","Data":"7208f430134cd22178b78f5ff83b54b2784c642f6277cb9dd9ec6b4375e67d46"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.535473 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" event={"ID":"2bb0fcab-02ae-40d7-acbf-8976c76d312a","Type":"ContainerStarted","Data":"338a0b3eaea39f9cad45cbb043badacff745a90d92463685d21cd7df34a23606"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.559676 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9dd466c4f-pgb9f" event={"ID":"63687f02-3cc2-4640-88f1-e312bbe550e7","Type":"ContainerStarted","Data":"dd4fc5ee3fa832ade67083f44bee4366316dc39ba10513e77fca2b066a64f6ab"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.559732 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9dd466c4f-pgb9f" event={"ID":"63687f02-3cc2-4640-88f1-e312bbe550e7","Type":"ContainerStarted","Data":"8ad2b5cef31d4ce4e707aa3fa3b858a6550cab485d559af0717aa1f4ccaa8fb3"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.579111 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qwzww" podStartSLOduration=4.429598588 podStartE2EDuration="58.579092589s" podCreationTimestamp="2025-12-10 12:12:49 +0000 UTC" firstStartedPulling="2025-12-10 12:12:51.628048532 +0000 UTC m=+1257.713573766" lastFinishedPulling="2025-12-10 12:13:45.777542543 +0000 UTC m=+1311.863067767" observedRunningTime="2025-12-10 12:13:47.548021133 +0000 UTC m=+1313.633546347" watchObservedRunningTime="2025-12-10 12:13:47.579092589 +0000 UTC m=+1313.664617813"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.598090 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db947f9b4-m6rgq" event={"ID":"1ebd8c65-e675-462a-bdba-db5d0ea01754","Type":"ContainerStarted","Data":"68117fb6c6773efc7d65271bdb5cbc0963ef02165af0c24f750754e51352a4bd"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.599222 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-db947f9b4-m6rgq"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.599280 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-db947f9b4-m6rgq"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.620132 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f948f96-w6dtl" event={"ID":"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4","Type":"ContainerStarted","Data":"a8f773664888bdd65d6b174dc3cf8be92750d95f5151acf68f3f4f2cf0653eaa"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.620177 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f948f96-w6dtl" event={"ID":"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4","Type":"ContainerStarted","Data":"4824a062051e476f736f433329611d5178f7dfd6175515388a95609127d1e54b"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.620188 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f948f96-w6dtl" event={"ID":"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4","Type":"ContainerStarted","Data":"8be44bdf0542fe2fc1928a7f9def21427ba9e09d95adf4f6dfcf573522c5dabc"}
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.621354 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b4f948f96-w6dtl"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.630168 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db947f9b4-m6rgq" podStartSLOduration=8.630152485 podStartE2EDuration="8.630152485s" podCreationTimestamp="2025-12-10 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:47.625596231 +0000 UTC m=+1313.711121465" watchObservedRunningTime="2025-12-10 12:13:47.630152485 +0000 UTC m=+1313.715677709"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.655058 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b4f948f96-w6dtl" podStartSLOduration=8.655039407 podStartE2EDuration="8.655039407s" podCreationTimestamp="2025-12-10 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:47.654686348 +0000 UTC m=+1313.740211572" watchObservedRunningTime="2025-12-10 12:13:47.655039407 +0000 UTC m=+1313.740564631"
Dec 10 12:13:47 crc kubenswrapper[4852]: I1210 12:13:47.825395 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.081573 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7dd8c6757f-lbdxp"]
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.193176 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef7f8f3-6f82-406b-bce2-e127bcd7546b" path="/var/lib/kubelet/pods/cef7f8f3-6f82-406b-bce2-e127bcd7546b/volumes"
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.637186 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" event={"ID":"2bb0fcab-02ae-40d7-acbf-8976c76d312a","Type":"ContainerStarted","Data":"4f1b8e3ad9482d05b0a4032c5faaa7a0537cf87c94ebcb7ccf4294ee6cc4c8da"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.637891 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-68cl7"
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.659052 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9dd466c4f-pgb9f" event={"ID":"63687f02-3cc2-4640-88f1-e312bbe550e7","Type":"ContainerStarted","Data":"06f8c4d21658151fa183253c152de43e51ca4f550a8f7c71c99f678fe78ae97d"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.663715 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9dd466c4f-pgb9f"
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.666757 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" podStartSLOduration=9.666742435 podStartE2EDuration="9.666742435s" podCreationTimestamp="2025-12-10 12:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:48.663689029 +0000 UTC m=+1314.749214273" watchObservedRunningTime="2025-12-10 12:13:48.666742435 +0000 UTC m=+1314.752267659"
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.672612 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7dd8c6757f-lbdxp" event={"ID":"fbe4801a-4ecc-4ecd-b00d-da9917481e2e","Type":"ContainerStarted","Data":"2f15e3d02d552789b59caba47c0d59c0a96be20c78e798ae14b58ef68d65c353"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.672665 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7dd8c6757f-lbdxp" event={"ID":"fbe4801a-4ecc-4ecd-b00d-da9917481e2e","Type":"ContainerStarted","Data":"4e2096a7775fb7c9ac77834d135cf372fb88b9a86ed3bd7e3169d3ab2b293af8"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.673303 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7dd8c6757f-lbdxp"
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.677985 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65bcb5c-dfe8-4412-8fbf-7e717ab28750","Type":"ContainerStarted","Data":"1272a9cac8b7bc11125b0e8689ad82ac95352cff64a81c37270dc33c31e0e7a4"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.678027 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65bcb5c-dfe8-4412-8fbf-7e717ab28750","Type":"ContainerStarted","Data":"659d3f2e69352b280e0d66b788d9908372f2352e2a753e4c3755267ed352a956"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.697415 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9dd466c4f-pgb9f" podStartSLOduration=6.697398061 podStartE2EDuration="6.697398061s" podCreationTimestamp="2025-12-10 12:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:48.696187131 +0000 UTC m=+1314.781712365" watchObservedRunningTime="2025-12-10 12:13:48.697398061 +0000 UTC m=+1314.782923305"
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.703086 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db947f9b4-m6rgq" event={"ID":"1ebd8c65-e675-462a-bdba-db5d0ea01754","Type":"ContainerStarted","Data":"7c32402da8080c50031007b859c340522214af746f2f4b7facbb11c819321b86"}
Dec 10 12:13:48 crc kubenswrapper[4852]: I1210 12:13:48.722916 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7dd8c6757f-lbdxp" podStartSLOduration=1.7228946189999998 podStartE2EDuration="1.722894619s" podCreationTimestamp="2025-12-10 12:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:48.716194551 +0000 UTC m=+1314.801719775" watchObservedRunningTime="2025-12-10 12:13:48.722894619 +0000 UTC m=+1314.808419853"
Dec 10 12:13:49 crc kubenswrapper[4852]: I1210 12:13:49.713335 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65bcb5c-dfe8-4412-8fbf-7e717ab28750","Type":"ContainerStarted","Data":"ee39b662ae77cbfd9718a2674a3f69996c0ca865953d7e740f76e47c7b9d6aed"}
Dec 10 12:13:49 crc kubenswrapper[4852]: I1210 12:13:49.749357 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.749339397 podStartE2EDuration="3.749339397s" podCreationTimestamp="2025-12-10 12:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:13:49.741556812 +0000 UTC m=+1315.827082036" watchObservedRunningTime="2025-12-10 12:13:49.749339397 +0000 UTC m=+1315.834864621"
Dec 10 12:13:50 crc kubenswrapper[4852]: I1210 12:13:50.726776 4852 generic.go:334] "Generic (PLEG): container finished" podID="71453d1b-e7a6-44a3-a449-b1c10eb76997" containerID="fd502ff720569629780faae2475f0059b48c63ab992be0e2b745816d33a1238f" exitCode=0
Dec 10 12:13:50 crc kubenswrapper[4852]: I1210 12:13:50.726868 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mwvfs" event={"ID":"71453d1b-e7a6-44a3-a449-b1c10eb76997","Type":"ContainerDied","Data":"fd502ff720569629780faae2475f0059b48c63ab992be0e2b745816d33a1238f"}
Dec 10 12:13:51 crc kubenswrapper[4852]: I1210 12:13:51.727105 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-955f9866d-84pn5" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Dec 10 12:13:51 crc kubenswrapper[4852]: I1210 12:13:51.779281 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-cc955f7d4-bclr7" podUID="35b770c5-bcea-4f68-8c5b-fb852f8b97a9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Dec 10 12:13:52 crc kubenswrapper[4852]: I1210 12:13:52.748044 4852 generic.go:334] "Generic (PLEG): container finished" podID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" containerID="22ecf46a2bcdff4b0e33d8ff9c3cf00f444cde102496f3e4cc42c286b898cac4" exitCode=0
Dec 10 12:13:52 crc kubenswrapper[4852]: I1210 12:13:52.748146 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qwzww" event={"ID":"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c","Type":"ContainerDied","Data":"22ecf46a2bcdff4b0e33d8ff9c3cf00f444cde102496f3e4cc42c286b898cac4"}
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.095405 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mwvfs"
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.182150 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-db-sync-config-data\") pod \"71453d1b-e7a6-44a3-a449-b1c10eb76997\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") "
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.182250 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-combined-ca-bundle\") pod \"71453d1b-e7a6-44a3-a449-b1c10eb76997\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") "
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.182386 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpfpb\" (UniqueName: \"kubernetes.io/projected/71453d1b-e7a6-44a3-a449-b1c10eb76997-kube-api-access-dpfpb\") pod \"71453d1b-e7a6-44a3-a449-b1c10eb76997\" (UID: \"71453d1b-e7a6-44a3-a449-b1c10eb76997\") "
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.192355 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "71453d1b-e7a6-44a3-a449-b1c10eb76997" (UID: "71453d1b-e7a6-44a3-a449-b1c10eb76997"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.194423 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71453d1b-e7a6-44a3-a449-b1c10eb76997-kube-api-access-dpfpb" (OuterVolumeSpecName: "kube-api-access-dpfpb") pod "71453d1b-e7a6-44a3-a449-b1c10eb76997" (UID: "71453d1b-e7a6-44a3-a449-b1c10eb76997"). InnerVolumeSpecName "kube-api-access-dpfpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.284960 4852 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.284992 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpfpb\" (UniqueName: \"kubernetes.io/projected/71453d1b-e7a6-44a3-a449-b1c10eb76997-kube-api-access-dpfpb\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.290141 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71453d1b-e7a6-44a3-a449-b1c10eb76997" (UID: "71453d1b-e7a6-44a3-a449-b1c10eb76997"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.387268 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71453d1b-e7a6-44a3-a449-b1c10eb76997-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.802062 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mwvfs" event={"ID":"71453d1b-e7a6-44a3-a449-b1c10eb76997","Type":"ContainerDied","Data":"8b1e91142767259115ece3aa2dcc77ceefe6ea6de2a28a3c7da1ba3668e801fa"}
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.802293 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1e91142767259115ece3aa2dcc77ceefe6ea6de2a28a3c7da1ba3668e801fa"
Dec 10 12:13:54 crc kubenswrapper[4852]: I1210 12:13:54.802371 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mwvfs" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.041423 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.118165 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l4k5p"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.118466 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="dnsmasq-dns" containerID="cri-o://fbaeecc7a4f86216c08b7503d4123c044867a74eaa0a9b490920f6f46493594e" gracePeriod=10 Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.665499 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-db94ccfb7-vvhtv"] Dec 10 12:13:55 crc kubenswrapper[4852]: E1210 12:13:55.687931 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71453d1b-e7a6-44a3-a449-b1c10eb76997" containerName="barbican-db-sync" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.687965 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="71453d1b-e7a6-44a3-a449-b1c10eb76997" containerName="barbican-db-sync" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.688188 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="71453d1b-e7a6-44a3-a449-b1c10eb76997" containerName="barbican-db-sync" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.689390 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.692156 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.692450 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.692672 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bslbz" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.710053 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-db94ccfb7-vvhtv"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.711961 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.716634 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc13355d-4438-440e-bfdf-debe0d6dae5b-logs\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.716675 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ngb\" (UniqueName: \"kubernetes.io/projected/cc13355d-4438-440e-bfdf-debe0d6dae5b-kube-api-access-z7ngb\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" 
Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.716717 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-combined-ca-bundle\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.716765 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-config-data\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.716790 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-config-data-custom\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.750665 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-ffd755b9d-ffwqf"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.752707 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.771969 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.786621 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-ffd755b9d-ffwqf"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819281 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc13355d-4438-440e-bfdf-debe0d6dae5b-logs\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819330 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ngb\" (UniqueName: \"kubernetes.io/projected/cc13355d-4438-440e-bfdf-debe0d6dae5b-kube-api-access-z7ngb\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819370 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67165a10-114d-48b8-9c9b-ce7525e7d98d-logs\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819396 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-combined-ca-bundle\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " 
pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819442 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rwl\" (UniqueName: \"kubernetes.io/projected/67165a10-114d-48b8-9c9b-ce7525e7d98d-kube-api-access-w8rwl\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819468 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-config-data\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819485 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-config-data-custom\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819511 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-config-data-custom\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819533 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-combined-ca-bundle\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819573 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-config-data\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.819993 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc13355d-4438-440e-bfdf-debe0d6dae5b-logs\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.829206 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-config-data-custom\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.834052 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-combined-ca-bundle\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.834288 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc13355d-4438-440e-bfdf-debe0d6dae5b-config-data\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.836398 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-274bh"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.837847 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.850403 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-274bh"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.853922 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ngb\" (UniqueName: \"kubernetes.io/projected/cc13355d-4438-440e-bfdf-debe0d6dae5b-kube-api-access-z7ngb\") pod \"barbican-worker-db94ccfb7-vvhtv\" (UID: \"cc13355d-4438-440e-bfdf-debe0d6dae5b\") " pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920100 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920162 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67165a10-114d-48b8-9c9b-ce7525e7d98d-logs\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920212 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-config\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920249 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rwl\" (UniqueName: \"kubernetes.io/projected/67165a10-114d-48b8-9c9b-ce7525e7d98d-kube-api-access-w8rwl\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920269 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5dn\" (UniqueName: \"kubernetes.io/projected/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-kube-api-access-mh5dn\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920293 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-config-data-custom\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920430 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-combined-ca-bundle\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920484 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920522 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920606 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-config-data\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920629 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.920674 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67165a10-114d-48b8-9c9b-ce7525e7d98d-logs\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.934444 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-config-data-custom\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.943208 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-combined-ca-bundle\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.950631 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67165a10-114d-48b8-9c9b-ce7525e7d98d-config-data\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.956520 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bbcc6b494-qbdnt"] Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.962666 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.965007 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.969714 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rwl\" (UniqueName: \"kubernetes.io/projected/67165a10-114d-48b8-9c9b-ce7525e7d98d-kube-api-access-w8rwl\") pod \"barbican-keystone-listener-ffd755b9d-ffwqf\" (UID: \"67165a10-114d-48b8-9c9b-ce7525e7d98d\") " pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:55 crc kubenswrapper[4852]: I1210 12:13:55.978398 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbcc6b494-qbdnt"] Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.018592 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-db94ccfb7-vvhtv" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.021636 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.023596 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e7a6d2e-da38-4efd-b572-0fb131ccae60-logs\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.023522 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.023772 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-combined-ca-bundle\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.023796 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025045 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.023873 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz9lt\" (UniqueName: \"kubernetes.io/projected/7e7a6d2e-da38-4efd-b572-0fb131ccae60-kube-api-access-wz9lt\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025169 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data-custom\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025215 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: 
\"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025327 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025389 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-config\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025416 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh5dn\" (UniqueName: \"kubernetes.io/projected/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-kube-api-access-mh5dn\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.025482 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.026023 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.026531 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.027152 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-config\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.062146 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5dn\" (UniqueName: \"kubernetes.io/projected/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-kube-api-access-mh5dn\") pod \"dnsmasq-dns-848cf88cfc-274bh\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.106890 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.126981 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e7a6d2e-da38-4efd-b572-0fb131ccae60-logs\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.127124 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-combined-ca-bundle\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.127160 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz9lt\" (UniqueName: \"kubernetes.io/projected/7e7a6d2e-da38-4efd-b572-0fb131ccae60-kube-api-access-wz9lt\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.127181 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data-custom\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.127293 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.326574 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e7a6d2e-da38-4efd-b572-0fb131ccae60-logs\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.329921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.330886 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-combined-ca-bundle\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.331873 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz9lt\" (UniqueName: \"kubernetes.io/projected/7e7a6d2e-da38-4efd-b572-0fb131ccae60-kube-api-access-wz9lt\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " 
pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.332083 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data-custom\") pod \"barbican-api-5bbcc6b494-qbdnt\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.349370 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.627839 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.999280 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 10 12:13:56 crc kubenswrapper[4852]: I1210 12:13:56.999344 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 10 12:13:57 crc kubenswrapper[4852]: I1210 12:13:57.031991 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 10 12:13:57 crc kubenswrapper[4852]: I1210 12:13:57.060685 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 10 12:13:57 crc kubenswrapper[4852]: I1210 12:13:57.868463 4852 generic.go:334] "Generic (PLEG): container finished" podID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerID="fbaeecc7a4f86216c08b7503d4123c044867a74eaa0a9b490920f6f46493594e" exitCode=0 Dec 10 12:13:57 crc kubenswrapper[4852]: I1210 12:13:57.868563 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" event={"ID":"b9ace072-49c8-4747-b7d5-1f6f01393c41","Type":"ContainerDied","Data":"fbaeecc7a4f86216c08b7503d4123c044867a74eaa0a9b490920f6f46493594e"} Dec 10 12:13:57 crc kubenswrapper[4852]: I1210 12:13:57.875996 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 10 12:13:57 crc kubenswrapper[4852]: I1210 12:13:57.876037 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.246904 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b6dbfcbc8-pkb6j"] Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.252433 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.256291 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.256731 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.269721 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b6dbfcbc8-pkb6j"] Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373223 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-config-data-custom\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373380 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-internal-tls-certs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373404 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-config-data\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373436 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zpj\" (UniqueName: \"kubernetes.io/projected/8c77ea6d-6206-4361-8b0f-e8f273666084-kube-api-access-d9zpj\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373476 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c77ea6d-6206-4361-8b0f-e8f273666084-logs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373511 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-combined-ca-bundle\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.373607 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-public-tls-certs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.475772 4852 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-public-tls-certs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.475816 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-config-data-custom\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.475876 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-internal-tls-certs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.475900 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-config-data\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.475937 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zpj\" (UniqueName: \"kubernetes.io/projected/8c77ea6d-6206-4361-8b0f-e8f273666084-kube-api-access-d9zpj\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.475981 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c77ea6d-6206-4361-8b0f-e8f273666084-logs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.476019 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-combined-ca-bundle\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.477078 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c77ea6d-6206-4361-8b0f-e8f273666084-logs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.486471 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-combined-ca-bundle\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.486662 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-internal-tls-certs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.486761 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-public-tls-certs\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.487059 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-config-data\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.487504 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c77ea6d-6206-4361-8b0f-e8f273666084-config-data-custom\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.503268 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zpj\" (UniqueName: \"kubernetes.io/projected/8c77ea6d-6206-4361-8b0f-e8f273666084-kube-api-access-d9zpj\") pod \"barbican-api-b6dbfcbc8-pkb6j\" (UID: \"8c77ea6d-6206-4361-8b0f-e8f273666084\") " pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:13:58 crc kubenswrapper[4852]: I1210 12:13:58.578493 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.077500 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.077928 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.079774 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.578384 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qwzww" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.727186 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-scripts\") pod \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.727890 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-db-sync-config-data\") pod \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.727949 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-etc-machine-id\") pod \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.728059 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-combined-ca-bundle\") pod \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.728188 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wsdq\" (UniqueName: \"kubernetes.io/projected/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-kube-api-access-8wsdq\") pod \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.728218 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-config-data\") pod \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\" (UID: \"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c\") " Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.733164 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" (UID: "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.777777 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-kube-api-access-8wsdq" (OuterVolumeSpecName: "kube-api-access-8wsdq") pod "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" (UID: "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c"). InnerVolumeSpecName "kube-api-access-8wsdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.778053 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-scripts" (OuterVolumeSpecName: "scripts") pod "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" (UID: "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.785440 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" (UID: "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.816084 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" (UID: "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.833424 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.833464 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wsdq\" (UniqueName: \"kubernetes.io/projected/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-kube-api-access-8wsdq\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.833479 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.833491 4852 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.833503 4852 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.864212 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-config-data" (OuterVolumeSpecName: "config-data") pod "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" (UID: "8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.912022 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qwzww" event={"ID":"8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c","Type":"ContainerDied","Data":"203d83cac456628bfde78fb2b5daf30479be5e86ba1441d7fae50722c414ac03"} Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.912079 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203d83cac456628bfde78fb2b5daf30479be5e86ba1441d7fae50722c414ac03" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.912143 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qwzww" Dec 10 12:14:00 crc kubenswrapper[4852]: I1210 12:14:00.936509 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:00 crc kubenswrapper[4852]: E1210 12:14:00.996542 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.061050 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.139913 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-svc\") pod \"b9ace072-49c8-4747-b7d5-1f6f01393c41\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.140069 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7w9b\" (UniqueName: \"kubernetes.io/projected/b9ace072-49c8-4747-b7d5-1f6f01393c41-kube-api-access-r7w9b\") pod \"b9ace072-49c8-4747-b7d5-1f6f01393c41\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.140126 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-config\") pod \"b9ace072-49c8-4747-b7d5-1f6f01393c41\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.140162 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-swift-storage-0\") pod \"b9ace072-49c8-4747-b7d5-1f6f01393c41\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.140254 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-sb\") pod \"b9ace072-49c8-4747-b7d5-1f6f01393c41\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.140290 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-nb\") pod \"b9ace072-49c8-4747-b7d5-1f6f01393c41\" (UID: \"b9ace072-49c8-4747-b7d5-1f6f01393c41\") " Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.148466 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ace072-49c8-4747-b7d5-1f6f01393c41-kube-api-access-r7w9b" (OuterVolumeSpecName: "kube-api-access-r7w9b") pod "b9ace072-49c8-4747-b7d5-1f6f01393c41" (UID: "b9ace072-49c8-4747-b7d5-1f6f01393c41"). InnerVolumeSpecName "kube-api-access-r7w9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.200256 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-config" (OuterVolumeSpecName: "config") pod "b9ace072-49c8-4747-b7d5-1f6f01393c41" (UID: "b9ace072-49c8-4747-b7d5-1f6f01393c41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.208252 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9ace072-49c8-4747-b7d5-1f6f01393c41" (UID: "b9ace072-49c8-4747-b7d5-1f6f01393c41"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.214790 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b9ace072-49c8-4747-b7d5-1f6f01393c41" (UID: "b9ace072-49c8-4747-b7d5-1f6f01393c41"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.217770 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b9ace072-49c8-4747-b7d5-1f6f01393c41" (UID: "b9ace072-49c8-4747-b7d5-1f6f01393c41"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.243040 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.243387 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7w9b\" (UniqueName: \"kubernetes.io/projected/b9ace072-49c8-4747-b7d5-1f6f01393c41-kube-api-access-r7w9b\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.243456 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.243527 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.243580 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.246106 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9ace072-49c8-4747-b7d5-1f6f01393c41" (UID: "b9ace072-49c8-4747-b7d5-1f6f01393c41"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.270104 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-db94ccfb7-vvhtv"] Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.323283 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-274bh"] Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.340090 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-ffd755b9d-ffwqf"] Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.347716 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9ace072-49c8-4747-b7d5-1f6f01393c41-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.463535 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bbcc6b494-qbdnt"] Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.474402 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b6dbfcbc8-pkb6j"] Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.972486 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:01 crc kubenswrapper[4852]: E1210 12:14:01.973450 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" containerName="cinder-db-sync" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.973463 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" containerName="cinder-db-sync" Dec 10 12:14:01 crc kubenswrapper[4852]: E1210 12:14:01.973480 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="init" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.973486 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="init" Dec 10 12:14:01 crc kubenswrapper[4852]: E1210 12:14:01.973496 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="dnsmasq-dns" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.973502 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="dnsmasq-dns" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.973673 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="dnsmasq-dns" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.973694 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" containerName="cinder-db-sync" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.974748 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.977128 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" event={"ID":"67165a10-114d-48b8-9c9b-ce7525e7d98d","Type":"ContainerStarted","Data":"634db974ef5914b3a43524cbd80b996612abe5c8b1e099a9787b6946bbf5f948"} Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.986090 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zdp77" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.986616 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.989746 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 10 12:14:01 crc kubenswrapper[4852]: I1210 12:14:01.995691 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.005187 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbcc6b494-qbdnt" event={"ID":"7e7a6d2e-da38-4efd-b572-0fb131ccae60","Type":"ContainerStarted","Data":"26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.005244 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbcc6b494-qbdnt" event={"ID":"7e7a6d2e-da38-4efd-b572-0fb131ccae60","Type":"ContainerStarted","Data":"52ff8b5fa3b875b80245e2d1543c46bc4faf045d111c2d24d960f744faed3179"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.027533 4852 generic.go:334] "Generic (PLEG): container finished" podID="fd5af705-c0a6-4dc6-9d6a-55ff6e610577" containerID="34a614323cbbea07f9aed7f8404af4621fd4c7a2cc51b769e03ed673541bb898" exitCode=0 Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.027622 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" event={"ID":"fd5af705-c0a6-4dc6-9d6a-55ff6e610577","Type":"ContainerDied","Data":"34a614323cbbea07f9aed7f8404af4621fd4c7a2cc51b769e03ed673541bb898"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.027648 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" event={"ID":"fd5af705-c0a6-4dc6-9d6a-55ff6e610577","Type":"ContainerStarted","Data":"26e00c1547b65a76107edd09928037f3b526945e1f61d1d7edf5106f5c330f35"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.068483 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp7bv\" (UniqueName: \"kubernetes.io/projected/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-kube-api-access-gp7bv\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.068537 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.068583 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.068629 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.068653 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.068697 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.072119 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerStarted","Data":"3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.072365 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.072362 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="proxy-httpd" containerID="cri-o://3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336" gracePeriod=30 Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.072396 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="sg-core" containerID="cri-o://f58c4ff28c6889725dde23eb4f974a70d19c67989fea7548a02e74df19d045f2" gracePeriod=30 Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.072598 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="ceilometer-notification-agent" containerID="cri-o://5b3c265eac126455f9541d52bf3d5da25b3b2ffcd55c59ecbfae2633f19d7cee" gracePeriod=30 Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.080822 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.101516 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db94ccfb7-vvhtv" event={"ID":"cc13355d-4438-440e-bfdf-debe0d6dae5b","Type":"ContainerStarted","Data":"9482a7244985c99cfed740c0a3302a32267e5c4f0af1ac314adb600b9d0d93cb"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.116693 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" 
event={"ID":"b9ace072-49c8-4747-b7d5-1f6f01393c41","Type":"ContainerDied","Data":"5e9d1e0bc1a45a21e8db4ef09416f28370ed53d92887169f42f216cfe54d11cc"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.116748 4852 scope.go:117] "RemoveContainer" containerID="fbaeecc7a4f86216c08b7503d4123c044867a74eaa0a9b490920f6f46493594e" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.116872 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.171200 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp7bv\" (UniqueName: \"kubernetes.io/projected/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-kube-api-access-gp7bv\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.171304 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.171376 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.171443 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.171486 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.171557 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.174297 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-274bh"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.184102 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.184176 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.201798 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.208623 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.209513 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.232881 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp7bv\" (UniqueName: \"kubernetes.io/projected/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-kube-api-access-gp7bv\") pod \"cinder-scheduler-0\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.343718 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.415876 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" event={"ID":"8c77ea6d-6206-4361-8b0f-e8f273666084","Type":"ContainerStarted","Data":"d91bc9a89c69f1b507509309563b320afb105b0b2fc1deec0e2c235fd4266b93"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.415924 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" event={"ID":"8c77ea6d-6206-4361-8b0f-e8f273666084","Type":"ContainerStarted","Data":"bb9aff518f5a494afd7472f0355ed55f61e58a564ae74bd74477ff6de32aa523"} Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.415955 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dh4tm"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.418584 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dh4tm"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.418624 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.419390 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.420691 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.422713 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.426368 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481148 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-config\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481258 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481323 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481410 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data-custom\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481499 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481526 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-scripts\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481591 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481608 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06ad202-e036-4755-8869-d336483b8791-logs\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481661 4852 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06ad202-e036-4755-8869-d336483b8791-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481732 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-svc\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.481831 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.482622 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhrl\" (UniqueName: \"kubernetes.io/projected/d06ad202-e036-4755-8869-d336483b8791-kube-api-access-xhhrl\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.482766 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5dz4\" (UniqueName: \"kubernetes.io/projected/6792d3ff-5d80-410e-98c4-57dc79836a58-kube-api-access-g5dz4\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.580643 4852 scope.go:117] "RemoveContainer" containerID="a2c74957f435247da30c451f17c0da080de3ceff932d1512b66ec724898ba694" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584113 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5dz4\" (UniqueName: \"kubernetes.io/projected/6792d3ff-5d80-410e-98c4-57dc79836a58-kube-api-access-g5dz4\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584170 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-config\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584199 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584253 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") 
" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584288 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data-custom\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584359 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584388 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-scripts\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584418 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584441 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06ad202-e036-4755-8869-d336483b8791-logs\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584466 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06ad202-e036-4755-8869-d336483b8791-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584492 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-svc\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584520 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.584553 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhrl\" (UniqueName: \"kubernetes.io/projected/d06ad202-e036-4755-8869-d336483b8791-kube-api-access-xhhrl\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.585459 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06ad202-e036-4755-8869-d336483b8791-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.585608 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-config\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.585691 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.585851 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06ad202-e036-4755-8869-d336483b8791-logs\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.586418 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.590961 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.593217 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-svc\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.607522 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-scripts\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.612100 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.640603 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l4k5p"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.640970 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l4k5p"] Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.672837 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5dz4\" (UniqueName: 
\"kubernetes.io/projected/6792d3ff-5d80-410e-98c4-57dc79836a58-kube-api-access-g5dz4\") pod \"dnsmasq-dns-6578955fd5-dh4tm\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.673306 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.681213 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhrl\" (UniqueName: \"kubernetes.io/projected/d06ad202-e036-4755-8869-d336483b8791-kube-api-access-xhhrl\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.748036 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data-custom\") pod \"cinder-api-0\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.762639 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:02 crc kubenswrapper[4852]: I1210 12:14:02.822120 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 10 12:14:02 crc kubenswrapper[4852]: E1210 12:14:02.932092 4852 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 10 12:14:02 crc kubenswrapper[4852]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fd5af705-c0a6-4dc6-9d6a-55ff6e610577/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 10 12:14:02 crc kubenswrapper[4852]: > podSandboxID="26e00c1547b65a76107edd09928037f3b526945e1f61d1d7edf5106f5c330f35" Dec 10 12:14:02 crc kubenswrapper[4852]: E1210 12:14:02.932293 4852 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 10 12:14:02 crc kubenswrapper[4852]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h8bhd9h696h649h588h5c6h658h5b4h57fh65h89h5f5h56h696h5dh8h57h597h68ch568h58dh66hf4h675h598h588h67dhb5h69h5dh6bq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mh5dn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-848cf88cfc-274bh_openstack(fd5af705-c0a6-4dc6-9d6a-55ff6e610577): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fd5af705-c0a6-4dc6-9d6a-55ff6e610577/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 10 12:14:02 crc kubenswrapper[4852]: > logger="UnhandledError" Dec 10 12:14:02 crc kubenswrapper[4852]: E1210 12:14:02.934333 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fd5af705-c0a6-4dc6-9d6a-55ff6e610577/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" 
podUID="fd5af705-c0a6-4dc6-9d6a-55ff6e610577" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.036167 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.215220 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" event={"ID":"8c77ea6d-6206-4361-8b0f-e8f273666084","Type":"ContainerStarted","Data":"eea05b051f33e1128acd62de376c921cc22764f1c46247a767ffe7b7a158e9c3"} Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.216420 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.216448 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.218700 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7","Type":"ContainerStarted","Data":"7b90bb51d036bb51d801098eca1af70f233c7d07875371063e9c8ec8fcebd9da"} Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.221382 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbcc6b494-qbdnt" event={"ID":"7e7a6d2e-da38-4efd-b572-0fb131ccae60","Type":"ContainerStarted","Data":"d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b"} Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.222458 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.222523 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.240521 4852 generic.go:334] "Generic (PLEG): container finished" podID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerID="3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336" exitCode=0 Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.240559 4852 generic.go:334] "Generic (PLEG): container finished" podID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerID="f58c4ff28c6889725dde23eb4f974a70d19c67989fea7548a02e74df19d045f2" exitCode=2 Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.240877 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerDied","Data":"3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336"} Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.240912 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerDied","Data":"f58c4ff28c6889725dde23eb4f974a70d19c67989fea7548a02e74df19d045f2"} Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.251288 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" podStartSLOduration=5.251267589 podStartE2EDuration="5.251267589s" podCreationTimestamp="2025-12-10 12:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:03.236722476 +0000 UTC m=+1329.322247710" watchObservedRunningTime="2025-12-10 12:14:03.251267589 +0000 UTC m=+1329.336792823" Dec 10 12:14:03 crc 
kubenswrapper[4852]: I1210 12:14:03.286755 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podStartSLOduration=8.286734335 podStartE2EDuration="8.286734335s" podCreationTimestamp="2025-12-10 12:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:03.268402167 +0000 UTC m=+1329.353927411" watchObservedRunningTime="2025-12-10 12:14:03.286734335 +0000 UTC m=+1329.372259569" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.577457 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dh4tm"] Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.696258 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.861972 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.968558 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-config\") pod \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.968636 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-swift-storage-0\") pod \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.968746 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-svc\") pod \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.968780 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh5dn\" (UniqueName: \"kubernetes.io/projected/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-kube-api-access-mh5dn\") pod \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.968852 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-nb\") pod \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.968962 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-sb\") pod \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\" (UID: \"fd5af705-c0a6-4dc6-9d6a-55ff6e610577\") " Dec 10 12:14:03 crc kubenswrapper[4852]: I1210 12:14:03.975859 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-kube-api-access-mh5dn" (OuterVolumeSpecName: "kube-api-access-mh5dn") pod "fd5af705-c0a6-4dc6-9d6a-55ff6e610577" (UID: "fd5af705-c0a6-4dc6-9d6a-55ff6e610577"). 
InnerVolumeSpecName "kube-api-access-mh5dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.035608 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-config" (OuterVolumeSpecName: "config") pod "fd5af705-c0a6-4dc6-9d6a-55ff6e610577" (UID: "fd5af705-c0a6-4dc6-9d6a-55ff6e610577"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.039967 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fd5af705-c0a6-4dc6-9d6a-55ff6e610577" (UID: "fd5af705-c0a6-4dc6-9d6a-55ff6e610577"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.049751 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd5af705-c0a6-4dc6-9d6a-55ff6e610577" (UID: "fd5af705-c0a6-4dc6-9d6a-55ff6e610577"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.058307 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd5af705-c0a6-4dc6-9d6a-55ff6e610577" (UID: "fd5af705-c0a6-4dc6-9d6a-55ff6e610577"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.060056 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd5af705-c0a6-4dc6-9d6a-55ff6e610577" (UID: "fd5af705-c0a6-4dc6-9d6a-55ff6e610577"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.071668 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.071708 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh5dn\" (UniqueName: \"kubernetes.io/projected/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-kube-api-access-mh5dn\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.071724 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.071736 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.071746 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.071758 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd5af705-c0a6-4dc6-9d6a-55ff6e610577-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.188594 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" path="/var/lib/kubelet/pods/b9ace072-49c8-4747-b7d5-1f6f01393c41/volumes" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.265215 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" event={"ID":"6792d3ff-5d80-410e-98c4-57dc79836a58","Type":"ContainerStarted","Data":"89d931cfd44a4b6eb8f561dd996763015caf3d24f05642e6606c0ddd3db8a439"} Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.268434 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d06ad202-e036-4755-8869-d336483b8791","Type":"ContainerStarted","Data":"e0276610495377dc07a73e5dc5135f0d9b2f6481ee82d830db1a7333e998d6a3"} Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.271278 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.272287 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-274bh" event={"ID":"fd5af705-c0a6-4dc6-9d6a-55ff6e610577","Type":"ContainerDied","Data":"26e00c1547b65a76107edd09928037f3b526945e1f61d1d7edf5106f5c330f35"} Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.272337 4852 scope.go:117] "RemoveContainer" containerID="34a614323cbbea07f9aed7f8404af4621fd4c7a2cc51b769e03ed673541bb898" Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.356436 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-274bh"] Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.377995 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-274bh"] Dec 10 12:14:04 crc kubenswrapper[4852]: I1210 12:14:04.727059 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:05 crc kubenswrapper[4852]: W1210 12:14:05.235834 4852 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-34a614323cbbea07f9aed7f8404af4621fd4c7a2cc51b769e03ed673541bb898.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-34a614323cbbea07f9aed7f8404af4621fd4c7a2cc51b769e03ed673541bb898.scope: no such file or directory Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.238958 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:14:05 crc kubenswrapper[4852]: W1210 12:14:05.252020 4852 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-conmon-4683e5077de0cb50ad52facc2165134b0fa73c6e3112825bee669ada9156c632.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-conmon-4683e5077de0cb50ad52facc2165134b0fa73c6e3112825bee669ada9156c632.scope: no such file or directory Dec 10 12:14:05 crc kubenswrapper[4852]: W1210 12:14:05.252071 4852 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-4683e5077de0cb50ad52facc2165134b0fa73c6e3112825bee669ada9156c632.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-4683e5077de0cb50ad52facc2165134b0fa73c6e3112825bee669ada9156c632.scope: no such file or directory Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.347436 4852 generic.go:334] "Generic (PLEG): container finished" podID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerID="b206a9b1d279bf59e81117c94f11c7f407bae80d59916a83bea99e87a869fc46" exitCode=0 Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.347514 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" 
event={"ID":"6792d3ff-5d80-410e-98c4-57dc79836a58","Type":"ContainerDied","Data":"b206a9b1d279bf59e81117c94f11c7f407bae80d59916a83bea99e87a869fc46"} Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.349506 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d06ad202-e036-4755-8869-d336483b8791","Type":"ContainerStarted","Data":"26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2"} Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.355578 4852 generic.go:334] "Generic (PLEG): container finished" podID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerID="b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b" exitCode=137 Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.355619 4852 generic.go:334] "Generic (PLEG): container finished" podID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerID="947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac" exitCode=137 Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.355643 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c94757ccc-jfgtq" event={"ID":"fdd39a7d-3499-4846-82b1-452bd627dd23","Type":"ContainerDied","Data":"b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b"} Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.355682 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c94757ccc-jfgtq" event={"ID":"fdd39a7d-3499-4846-82b1-452bd627dd23","Type":"ContainerDied","Data":"947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac"} Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.447426 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:14:05 crc kubenswrapper[4852]: E1210 12:14:05.562847 4852 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9ace072_49c8_4747_b7d5_1f6f01393c41.slice/crio-5e9d1e0bc1a45a21e8db4ef09416f28370ed53d92887169f42f216cfe54d11cc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd39a7d_3499_4846_82b1_452bd627dd23.slice/crio-conmon-947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd39a7d_3499_4846_82b1_452bd627dd23.slice/crio-b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02730e24_a11e_4c7b_9470_9290b251bcb9.slice/crio-conmon-3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice/crio-26e00c1547b65a76107edd09928037f3b526945e1f61d1d7edf5106f5c330f35\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02730e24_a11e_4c7b_9470_9290b251bcb9.slice/crio-3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9ace072_49c8_4747_b7d5_1f6f01393c41.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd39a7d_3499_4846_82b1_452bd627dd23.slice/crio-conmon-b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd39a7d_3499_4846_82b1_452bd627dd23.slice/crio-947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5af705_c0a6_4dc6_9d6a_55ff6e610577.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02730e24_a11e_4c7b_9470_9290b251bcb9.slice/crio-f58c4ff28c6889725dde23eb4f974a70d19c67989fea7548a02e74df19d045f2.scope\": RecentStats: unable to find data in memory cache]" Dec 10 12:14:05 crc kubenswrapper[4852]: I1210 12:14:05.710527 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-l4k5p" podUID="b9ace072-49c8-4747-b7d5-1f6f01393c41" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Dec 10 12:14:06 crc kubenswrapper[4852]: I1210 12:14:06.183570 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5af705-c0a6-4dc6-9d6a-55ff6e610577" path="/var/lib/kubelet/pods/fd5af705-c0a6-4dc6-9d6a-55ff6e610577/volumes" Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.187366 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.347975 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-cc955f7d4-bclr7" Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.379667 4852 generic.go:334] "Generic (PLEG): container finished" podID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerID="5b3c265eac126455f9541d52bf3d5da25b3b2ffcd55c59ecbfae2633f19d7cee" exitCode=0 Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.379721 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerDied","Data":"5b3c265eac126455f9541d52bf3d5da25b3b2ffcd55c59ecbfae2633f19d7cee"} Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.423088 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-955f9866d-84pn5"] Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.423639 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-955f9866d-84pn5" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon-log" containerID="cri-o://cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f" gracePeriod=30 Dec 10 12:14:07 crc kubenswrapper[4852]: I1210 12:14:07.423776 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-955f9866d-84pn5" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" containerID="cri-o://7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893" gracePeriod=30 Dec 10 12:14:08 crc kubenswrapper[4852]: I1210 12:14:08.768646 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.351115 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413702 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnzd2\" (UniqueName: \"kubernetes.io/projected/02730e24-a11e-4c7b-9470-9290b251bcb9-kube-api-access-dnzd2\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413781 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-scripts\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413802 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-sg-core-conf-yaml\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413831 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-run-httpd\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413853 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-combined-ca-bundle\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413898 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-log-httpd\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.413930 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-config-data\") pod \"02730e24-a11e-4c7b-9470-9290b251bcb9\" (UID: \"02730e24-a11e-4c7b-9470-9290b251bcb9\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.421351 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.421640 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.421808 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02730e24-a11e-4c7b-9470-9290b251bcb9-kube-api-access-dnzd2" (OuterVolumeSpecName: "kube-api-access-dnzd2") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "kube-api-access-dnzd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.442481 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-scripts" (OuterVolumeSpecName: "scripts") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.472762 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" event={"ID":"6792d3ff-5d80-410e-98c4-57dc79836a58","Type":"ContainerStarted","Data":"f4ba41145f05a7545fb638ab4bbc41aa422573ad2488530bf8b1c115bdfae972"} Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.473890 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.495936 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d06ad202-e036-4755-8869-d336483b8791","Type":"ContainerStarted","Data":"5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60"} Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.496136 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api-log" containerID="cri-o://26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2" gracePeriod=30 Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.496384 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.496389 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api" containerID="cri-o://5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60" gracePeriod=30 Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.502946 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.505366 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" podStartSLOduration=7.505345651 podStartE2EDuration="7.505345651s" podCreationTimestamp="2025-12-10 12:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:09.492438828 +0000 UTC m=+1335.577964052" watchObservedRunningTime="2025-12-10 12:14:09.505345651 +0000 UTC m=+1335.590870885" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.515878 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnzd2\" (UniqueName: \"kubernetes.io/projected/02730e24-a11e-4c7b-9470-9290b251bcb9-kube-api-access-dnzd2\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.515914 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.515926 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.515936 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.515946 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02730e24-a11e-4c7b-9470-9290b251bcb9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.526524 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02730e24-a11e-4c7b-9470-9290b251bcb9","Type":"ContainerDied","Data":"9cdef688446cb91b2db69a04d0920b34a65556fe8a0731cc9e278c9bbe28ceae"} Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.526834 4852 scope.go:117] "RemoveContainer" containerID="3bad321407c992a73477bb6a8e5badb4c7da2f63ffcf91a4cc2479027dfd0336" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.527000 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.536128 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.536103259 podStartE2EDuration="7.536103259s" podCreationTimestamp="2025-12-10 12:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:09.521956996 +0000 UTC m=+1335.607482220" watchObservedRunningTime="2025-12-10 12:14:09.536103259 +0000 UTC m=+1335.621628483" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.571709 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.610647 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-config-data" (OuterVolumeSpecName: "config-data") pod "02730e24-a11e-4c7b-9470-9290b251bcb9" (UID: "02730e24-a11e-4c7b-9470-9290b251bcb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.618976 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.619022 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02730e24-a11e-4c7b-9470-9290b251bcb9-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.666475 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.719994 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-config-data\") pod \"fdd39a7d-3499-4846-82b1-452bd627dd23\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.720404 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pn5d\" (UniqueName: \"kubernetes.io/projected/fdd39a7d-3499-4846-82b1-452bd627dd23-kube-api-access-4pn5d\") pod \"fdd39a7d-3499-4846-82b1-452bd627dd23\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.720549 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd39a7d-3499-4846-82b1-452bd627dd23-logs\") pod \"fdd39a7d-3499-4846-82b1-452bd627dd23\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.720685 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-scripts\") pod \"fdd39a7d-3499-4846-82b1-452bd627dd23\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.720848 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdd39a7d-3499-4846-82b1-452bd627dd23-horizon-secret-key\") pod \"fdd39a7d-3499-4846-82b1-452bd627dd23\" (UID: \"fdd39a7d-3499-4846-82b1-452bd627dd23\") " Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.921471 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.929254 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.942376 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:09 crc kubenswrapper[4852]: E1210 12:14:09.942952 4852 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="proxy-httpd" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.942971 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="proxy-httpd" Dec 10 12:14:09 crc kubenswrapper[4852]: E1210 12:14:09.942984 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.942990 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon" Dec 10 12:14:09 crc kubenswrapper[4852]: E1210 12:14:09.943033 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="sg-core" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.943041 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="sg-core" Dec 10 12:14:09 crc kubenswrapper[4852]: E1210 12:14:09.943084 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon-log" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.943955 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon-log" Dec 10 12:14:09 crc kubenswrapper[4852]: E1210 12:14:09.943992 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5af705-c0a6-4dc6-9d6a-55ff6e610577" containerName="init" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944001 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5af705-c0a6-4dc6-9d6a-55ff6e610577" containerName="init" Dec 10 12:14:09 crc kubenswrapper[4852]: E1210 12:14:09.944015 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="ceilometer-notification-agent" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944021 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="ceilometer-notification-agent" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944456 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="proxy-httpd" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944560 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="ceilometer-notification-agent" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944577 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon-log" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944588 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" containerName="horizon" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944596 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5af705-c0a6-4dc6-9d6a-55ff6e610577" containerName="init" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.944608 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" containerName="sg-core" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.946391 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.949369 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.951504 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.972587 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.975638 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-config-data" (OuterVolumeSpecName: "config-data") pod "fdd39a7d-3499-4846-82b1-452bd627dd23" (UID: "fdd39a7d-3499-4846-82b1-452bd627dd23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.978362 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd39a7d-3499-4846-82b1-452bd627dd23-logs" (OuterVolumeSpecName: "logs") pod "fdd39a7d-3499-4846-82b1-452bd627dd23" (UID: "fdd39a7d-3499-4846-82b1-452bd627dd23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.986803 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-scripts" (OuterVolumeSpecName: "scripts") pod "fdd39a7d-3499-4846-82b1-452bd627dd23" (UID: "fdd39a7d-3499-4846-82b1-452bd627dd23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.990846 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd39a7d-3499-4846-82b1-452bd627dd23-kube-api-access-4pn5d" (OuterVolumeSpecName: "kube-api-access-4pn5d") pod "fdd39a7d-3499-4846-82b1-452bd627dd23" (UID: "fdd39a7d-3499-4846-82b1-452bd627dd23"). InnerVolumeSpecName "kube-api-access-4pn5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:09 crc kubenswrapper[4852]: I1210 12:14:09.994312 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd39a7d-3499-4846-82b1-452bd627dd23-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fdd39a7d-3499-4846-82b1-452bd627dd23" (UID: "fdd39a7d-3499-4846-82b1-452bd627dd23"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.011576 4852 scope.go:117] "RemoveContainer" containerID="f58c4ff28c6889725dde23eb4f974a70d19c67989fea7548a02e74df19d045f2" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.026506 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.026538 4852 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdd39a7d-3499-4846-82b1-452bd627dd23-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.026549 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdd39a7d-3499-4846-82b1-452bd627dd23-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.026559 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pn5d\" (UniqueName: \"kubernetes.io/projected/fdd39a7d-3499-4846-82b1-452bd627dd23-kube-api-access-4pn5d\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.026570 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd39a7d-3499-4846-82b1-452bd627dd23-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.085456 4852 scope.go:117] "RemoveContainer" containerID="5b3c265eac126455f9541d52bf3d5da25b3b2ffcd55c59ecbfae2633f19d7cee" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.129834 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-run-httpd\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.130208 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.130340 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-log-httpd\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.130376 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqp4j\" (UniqueName: \"kubernetes.io/projected/0dead70d-b3b6-4bee-a6cf-5e10924a3877-kube-api-access-mqp4j\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.130399 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-scripts\") pod \"ceilometer-0\" (UID: 
\"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.130439 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.130492 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-config-data\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.186301 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02730e24-a11e-4c7b-9470-9290b251bcb9" path="/var/lib/kubelet/pods/02730e24-a11e-4c7b-9470-9290b251bcb9/volumes" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231627 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-log-httpd\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231690 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqp4j\" (UniqueName: \"kubernetes.io/projected/0dead70d-b3b6-4bee-a6cf-5e10924a3877-kube-api-access-mqp4j\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231715 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-scripts\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231738 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231787 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-config-data\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231872 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-run-httpd\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.231918 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 
12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.240027 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.244904 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.247068 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-scripts\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.247546 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-log-httpd\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.248770 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-run-httpd\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.253131 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-config-data\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.279179 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqp4j\" (UniqueName: \"kubernetes.io/projected/0dead70d-b3b6-4bee-a6cf-5e10924a3877-kube-api-access-mqp4j\") pod \"ceilometer-0\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.287797 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.407697 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.717270 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db94ccfb7-vvhtv" event={"ID":"cc13355d-4438-440e-bfdf-debe0d6dae5b","Type":"ContainerStarted","Data":"7598d4a6c054e9d40005ec851a4218e5970f1317e34f35fd26d49cdf161d3727"} Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.727079 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" event={"ID":"67165a10-114d-48b8-9c9b-ce7525e7d98d","Type":"ContainerStarted","Data":"655298da971994b993dddb1b9f1ec231e8a064ece5a4e3770f4516093709fd72"} Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.795463 4852 generic.go:334] "Generic (PLEG): container finished" podID="d06ad202-e036-4755-8869-d336483b8791" containerID="26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2" exitCode=143 Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.800023 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d06ad202-e036-4755-8869-d336483b8791","Type":"ContainerDied","Data":"26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2"} Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.851410 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c94757ccc-jfgtq" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.851846 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c94757ccc-jfgtq" event={"ID":"fdd39a7d-3499-4846-82b1-452bd627dd23","Type":"ContainerDied","Data":"f2dd917be3d8fa042acf5c634fa82cdbc02cb741c589466be27689098064dd1d"} Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.851896 4852 scope.go:117] "RemoveContainer" containerID="b7897cdb820a26ec505ea7bfe1fd030efbd31ff4ee3fcb3391d5d68fe6f8a59b" Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.928036 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c94757ccc-jfgtq"] Dec 10 12:14:10 crc kubenswrapper[4852]: I1210 12:14:10.980141 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c94757ccc-jfgtq"] Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.088092 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.156625 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.195819 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.252404 4852 scope.go:117] "RemoveContainer" containerID="947130cad1a59ee571cede0e13da15dd7fa9df8f7d910484092d50a1fbee5aac" Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.500373 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b6dbfcbc8-pkb6j" Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.578686 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbcc6b494-qbdnt"] Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.731136 4852 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/horizon-955f9866d-84pn5" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.910791 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db94ccfb7-vvhtv" event={"ID":"cc13355d-4438-440e-bfdf-debe0d6dae5b","Type":"ContainerStarted","Data":"4872736e0dacfe0eda39725fcd473f5777ab161c3e9d1575dfef37ff9d21a163"} Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.928591 4852 generic.go:334] "Generic (PLEG): container finished" podID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerID="7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893" exitCode=0 Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.928658 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-955f9866d-84pn5" event={"ID":"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952","Type":"ContainerDied","Data":"7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893"} Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.943352 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-db94ccfb7-vvhtv" podStartSLOduration=8.819398455 podStartE2EDuration="16.943333548s" podCreationTimestamp="2025-12-10 12:13:55 +0000 UTC" firstStartedPulling="2025-12-10 12:14:01.276124757 +0000 UTC m=+1327.361649981" lastFinishedPulling="2025-12-10 12:14:09.40005985 +0000 UTC m=+1335.485585074" observedRunningTime="2025-12-10 12:14:11.941774759 +0000 UTC m=+1338.027299983" watchObservedRunningTime="2025-12-10 12:14:11.943333548 +0000 UTC m=+1338.028858772" Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.958803 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" event={"ID":"67165a10-114d-48b8-9c9b-ce7525e7d98d","Type":"ContainerStarted","Data":"9cfa293a3e214b2b474693fc3ac7c8dbbf5b9adf26cd07a1975b7025c82cf42a"} Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.981570 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7","Type":"ContainerStarted","Data":"3715a632bbe77e0d803a2f52e91d6055d3988b3eef335a81e3cd67b1126dd9e3"} Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.996987 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api-log" containerID="cri-o://26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d" gracePeriod=30 Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.997300 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerStarted","Data":"47d1459971d9f200a984e9fccd6aa1e9054430ab6e1abf9f2478f3d4058667d6"} Dec 10 12:14:11 crc kubenswrapper[4852]: I1210 12:14:11.998363 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" containerID="cri-o://d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b" gracePeriod=30 Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.009398 4852 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-ffd755b9d-ffwqf" podStartSLOduration=9.078292315 podStartE2EDuration="17.009378329s" podCreationTimestamp="2025-12-10 12:13:55 +0000 UTC" firstStartedPulling="2025-12-10 12:14:01.310699171 +0000 UTC m=+1327.396224395" lastFinishedPulling="2025-12-10 12:14:09.241785185 +0000 UTC m=+1335.327310409" observedRunningTime="2025-12-10 12:14:12.00504409 +0000 UTC m=+1338.090569314" watchObservedRunningTime="2025-12-10 12:14:12.009378329 +0000 UTC m=+1338.094903563" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.014187 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": EOF" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.017759 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": EOF" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.017757 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": EOF" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.017849 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": EOF" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.083526 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.191613 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd39a7d-3499-4846-82b1-452bd627dd23" path="/var/lib/kubelet/pods/fdd39a7d-3499-4846-82b1-452bd627dd23/volumes" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.194500 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-db947f9b4-m6rgq" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.532041 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9dd466c4f-pgb9f" Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.626160 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b4f948f96-w6dtl"] Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.627067 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b4f948f96-w6dtl" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-api" containerID="cri-o://4824a062051e476f736f433329611d5178f7dfd6175515388a95609127d1e54b" gracePeriod=30 Dec 10 12:14:12 crc kubenswrapper[4852]: I1210 12:14:12.627663 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b4f948f96-w6dtl" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-httpd" containerID="cri-o://a8f773664888bdd65d6b174dc3cf8be92750d95f5151acf68f3f4f2cf0653eaa" gracePeriod=30 Dec 10 12:14:13 crc 
kubenswrapper[4852]: I1210 12:14:13.021573 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7","Type":"ContainerStarted","Data":"74e4ba32144c0f58927e596841258dad73b8af5eb4940c88db318cdcda6c9a03"} Dec 10 12:14:13 crc kubenswrapper[4852]: I1210 12:14:13.038956 4852 generic.go:334] "Generic (PLEG): container finished" podID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerID="26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d" exitCode=143 Dec 10 12:14:13 crc kubenswrapper[4852]: I1210 12:14:13.039081 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbcc6b494-qbdnt" event={"ID":"7e7a6d2e-da38-4efd-b572-0fb131ccae60","Type":"ContainerDied","Data":"26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d"} Dec 10 12:14:13 crc kubenswrapper[4852]: I1210 12:14:13.046778 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerStarted","Data":"9c178ca548780cf488018059f1b3248d6fa119166e911d4b6133f6270d03c18b"} Dec 10 12:14:13 crc kubenswrapper[4852]: I1210 12:14:13.052443 4852 generic.go:334] "Generic (PLEG): container finished" podID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerID="a8f773664888bdd65d6b174dc3cf8be92750d95f5151acf68f3f4f2cf0653eaa" exitCode=0 Dec 10 12:14:13 crc kubenswrapper[4852]: I1210 12:14:13.053514 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f948f96-w6dtl" event={"ID":"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4","Type":"ContainerDied","Data":"a8f773664888bdd65d6b174dc3cf8be92750d95f5151acf68f3f4f2cf0653eaa"} Dec 10 12:14:13 crc kubenswrapper[4852]: I1210 12:14:13.073801 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.598877036 podStartE2EDuration="12.073776655s" podCreationTimestamp="2025-12-10 12:14:01 +0000 UTC" firstStartedPulling="2025-12-10 12:14:03.095512557 +0000 UTC m=+1329.181037781" lastFinishedPulling="2025-12-10 12:14:09.570412176 +0000 UTC m=+1335.655937400" observedRunningTime="2025-12-10 12:14:13.051219371 +0000 UTC m=+1339.136744605" watchObservedRunningTime="2025-12-10 12:14:13.073776655 +0000 UTC m=+1339.159301889" Dec 10 12:14:14 crc kubenswrapper[4852]: I1210 12:14:14.063013 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerStarted","Data":"d844717579df10010211d242c626820e5e63a7d13e3d6820da20344affbcb357"} Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.088368 4852 generic.go:334] "Generic (PLEG): container finished" podID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerID="4824a062051e476f736f433329611d5178f7dfd6175515388a95609127d1e54b" exitCode=0 Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.088709 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f948f96-w6dtl" event={"ID":"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4","Type":"ContainerDied","Data":"4824a062051e476f736f433329611d5178f7dfd6175515388a95609127d1e54b"} Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.110431 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerStarted","Data":"f530d362208c1b6a46901913496b83ba4b18795421bf22702a2a86d5fc142262"} Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 
12:14:15.295006 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.362280 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-httpd-config\") pod \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.362339 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp86j\" (UniqueName: \"kubernetes.io/projected/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-kube-api-access-hp86j\") pod \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.362367 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-ovndb-tls-certs\") pod \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.362396 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-combined-ca-bundle\") pod \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.362449 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-config\") pod \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\" (UID: \"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4\") " Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.367110 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" (UID: "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.376458 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-kube-api-access-hp86j" (OuterVolumeSpecName: "kube-api-access-hp86j") pod "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" (UID: "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4"). InnerVolumeSpecName "kube-api-access-hp86j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.442379 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-config" (OuterVolumeSpecName: "config") pod "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" (UID: "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4"). InnerVolumeSpecName "config". 
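
The probe failures recorded in this section come in three flavors: a TCP readiness probe timing out (dial tcp 10.217.0.143:5353: i/o timeout), an HTTP GET refused outright (connect: connection refused against the horizon dashboard), and an HTTP GET cut off mid-response (EOF / connection reset on the barbican healthcheck). A rough Go approximation of both probe types; kubelet's prober counts any 2xx or 3xx status as success, which the HTTP branch mirrors:

```go
package main

import (
	"fmt"
	"net"
	"net/http"
	"time"
)

func tcpProbe(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err // e.g. "i/o timeout" or "connect: connection refused"
	}
	return conn.Close()
}

func httpProbe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "EOF" when the server closes the connection early
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Targets copied from the probe-failure entries in this log.
	fmt.Println(tcpProbe("10.217.0.143:5353", time.Second))
	fmt.Println(httpProbe("http://10.217.0.160:9311/healthcheck", time.Second))
}
```
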
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.442714 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" (UID: "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.464345 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp86j\" (UniqueName: \"kubernetes.io/projected/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-kube-api-access-hp86j\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.464384 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.464400 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.464412 4852 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.468617 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" (UID: "1348b87e-f303-4f39-9cf9-aa55ca6b0fd4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:15 crc kubenswrapper[4852]: I1210 12:14:15.566068 4852 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:16 crc kubenswrapper[4852]: I1210 12:14:16.120462 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f948f96-w6dtl" event={"ID":"1348b87e-f303-4f39-9cf9-aa55ca6b0fd4","Type":"ContainerDied","Data":"8be44bdf0542fe2fc1928a7f9def21427ba9e09d95adf4f6dfcf573522c5dabc"} Dec 10 12:14:16 crc kubenswrapper[4852]: I1210 12:14:16.120540 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b4f948f96-w6dtl" Dec 10 12:14:16 crc kubenswrapper[4852]: I1210 12:14:16.120716 4852 scope.go:117] "RemoveContainer" containerID="a8f773664888bdd65d6b174dc3cf8be92750d95f5151acf68f3f4f2cf0653eaa" Dec 10 12:14:16 crc kubenswrapper[4852]: I1210 12:14:16.143016 4852 scope.go:117] "RemoveContainer" containerID="4824a062051e476f736f433329611d5178f7dfd6175515388a95609127d1e54b" Dec 10 12:14:16 crc kubenswrapper[4852]: I1210 12:14:16.161123 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b4f948f96-w6dtl"] Dec 10 12:14:16 crc kubenswrapper[4852]: I1210 12:14:16.197359 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b4f948f96-w6dtl"] Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.131823 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerStarted","Data":"7afb19b742112b9cb7a90b9478dd12fed8cc3a169d6d00fb3389c79388e8bb0e"} Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.132391 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.155833 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.05910818 podStartE2EDuration="8.155816302s" podCreationTimestamp="2025-12-10 12:14:09 +0000 UTC" firstStartedPulling="2025-12-10 12:14:11.26903547 +0000 UTC m=+1337.354560694" lastFinishedPulling="2025-12-10 12:14:16.365743592 +0000 UTC m=+1342.451268816" observedRunningTime="2025-12-10 12:14:17.15013187 +0000 UTC m=+1343.235657094" watchObservedRunningTime="2025-12-10 12:14:17.155816302 +0000 UTC m=+1343.241341526" Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.345708 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.635289 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.765304 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.822388 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-68cl7"] Dec 10 12:14:17 crc kubenswrapper[4852]: I1210 12:14:17.822699 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerName="dnsmasq-dns" containerID="cri-o://4f1b8e3ad9482d05b0a4032c5faaa7a0537cf87c94ebcb7ccf4294ee6cc4c8da" gracePeriod=10 Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.191550 4852 generic.go:334] "Generic (PLEG): container finished" podID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerID="4f1b8e3ad9482d05b0a4032c5faaa7a0537cf87c94ebcb7ccf4294ee6cc4c8da" exitCode=0 Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.198817 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" path="/var/lib/kubelet/pods/1348b87e-f303-4f39-9cf9-aa55ca6b0fd4/volumes" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.199730 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" 
event={"ID":"2bb0fcab-02ae-40d7-acbf-8976c76d312a","Type":"ContainerDied","Data":"4f1b8e3ad9482d05b0a4032c5faaa7a0537cf87c94ebcb7ccf4294ee6cc4c8da"} Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.268932 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.467911 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:37354->10.217.0.160:9311: read: connection reset by peer" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.467984 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5bbcc6b494-qbdnt" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:37344->10.217.0.160:9311: read: connection reset by peer" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.662160 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.837624 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-sb\") pod \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.838001 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-nb\") pod \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.838145 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-swift-storage-0\") pod \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.838184 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q95v\" (UniqueName: \"kubernetes.io/projected/2bb0fcab-02ae-40d7-acbf-8976c76d312a-kube-api-access-7q95v\") pod \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.838358 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-svc\") pod \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.838472 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-config\") pod \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\" (UID: \"2bb0fcab-02ae-40d7-acbf-8976c76d312a\") " Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.847465 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2bb0fcab-02ae-40d7-acbf-8976c76d312a-kube-api-access-7q95v" (OuterVolumeSpecName: "kube-api-access-7q95v") pod "2bb0fcab-02ae-40d7-acbf-8976c76d312a" (UID: "2bb0fcab-02ae-40d7-acbf-8976c76d312a"). InnerVolumeSpecName "kube-api-access-7q95v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.916916 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-config" (OuterVolumeSpecName: "config") pod "2bb0fcab-02ae-40d7-acbf-8976c76d312a" (UID: "2bb0fcab-02ae-40d7-acbf-8976c76d312a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.927756 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bb0fcab-02ae-40d7-acbf-8976c76d312a" (UID: "2bb0fcab-02ae-40d7-acbf-8976c76d312a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.940245 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bb0fcab-02ae-40d7-acbf-8976c76d312a" (UID: "2bb0fcab-02ae-40d7-acbf-8976c76d312a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.943978 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.944002 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q95v\" (UniqueName: \"kubernetes.io/projected/2bb0fcab-02ae-40d7-acbf-8976c76d312a-kube-api-access-7q95v\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.944013 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.944022 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.950840 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bb0fcab-02ae-40d7-acbf-8976c76d312a" (UID: "2bb0fcab-02ae-40d7-acbf-8976c76d312a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:18 crc kubenswrapper[4852]: I1210 12:14:18.970118 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2bb0fcab-02ae-40d7-acbf-8976c76d312a" (UID: "2bb0fcab-02ae-40d7-acbf-8976c76d312a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.042345 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.045528 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.045559 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bb0fcab-02ae-40d7-acbf-8976c76d312a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.146865 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e7a6d2e-da38-4efd-b572-0fb131ccae60-logs\") pod \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.146944 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data\") pod \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.147063 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-combined-ca-bundle\") pod \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.147122 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz9lt\" (UniqueName: \"kubernetes.io/projected/7e7a6d2e-da38-4efd-b572-0fb131ccae60-kube-api-access-wz9lt\") pod \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.147177 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data-custom\") pod \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\" (UID: \"7e7a6d2e-da38-4efd-b572-0fb131ccae60\") " Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.147294 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e7a6d2e-da38-4efd-b572-0fb131ccae60-logs" (OuterVolumeSpecName: "logs") pod "7e7a6d2e-da38-4efd-b572-0fb131ccae60" (UID: "7e7a6d2e-da38-4efd-b572-0fb131ccae60"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.147600 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e7a6d2e-da38-4efd-b572-0fb131ccae60-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.164418 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e7a6d2e-da38-4efd-b572-0fb131ccae60" (UID: "7e7a6d2e-da38-4efd-b572-0fb131ccae60"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.164487 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7a6d2e-da38-4efd-b572-0fb131ccae60-kube-api-access-wz9lt" (OuterVolumeSpecName: "kube-api-access-wz9lt") pod "7e7a6d2e-da38-4efd-b572-0fb131ccae60" (UID: "7e7a6d2e-da38-4efd-b572-0fb131ccae60"). InnerVolumeSpecName "kube-api-access-wz9lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.179480 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e7a6d2e-da38-4efd-b572-0fb131ccae60" (UID: "7e7a6d2e-da38-4efd-b572-0fb131ccae60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.203005 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data" (OuterVolumeSpecName: "config-data") pod "7e7a6d2e-da38-4efd-b572-0fb131ccae60" (UID: "7e7a6d2e-da38-4efd-b572-0fb131ccae60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.210500 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" event={"ID":"2bb0fcab-02ae-40d7-acbf-8976c76d312a","Type":"ContainerDied","Data":"338a0b3eaea39f9cad45cbb043badacff745a90d92463685d21cd7df34a23606"} Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.210550 4852 scope.go:117] "RemoveContainer" containerID="4f1b8e3ad9482d05b0a4032c5faaa7a0537cf87c94ebcb7ccf4294ee6cc4c8da" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.210584 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-68cl7" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.213939 4852 generic.go:334] "Generic (PLEG): container finished" podID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerID="d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b" exitCode=0 Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.214025 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbcc6b494-qbdnt" event={"ID":"7e7a6d2e-da38-4efd-b572-0fb131ccae60","Type":"ContainerDied","Data":"d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b"} Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.214082 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bbcc6b494-qbdnt" event={"ID":"7e7a6d2e-da38-4efd-b572-0fb131ccae60","Type":"ContainerDied","Data":"52ff8b5fa3b875b80245e2d1543c46bc4faf045d111c2d24d960f744faed3179"} Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.214151 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bbcc6b494-qbdnt" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.214218 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="probe" containerID="cri-o://74e4ba32144c0f58927e596841258dad73b8af5eb4940c88db318cdcda6c9a03" gracePeriod=30 Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.214167 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="cinder-scheduler" containerID="cri-o://3715a632bbe77e0d803a2f52e91d6055d3988b3eef335a81e3cd67b1126dd9e3" gracePeriod=30 Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.259461 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz9lt\" (UniqueName: \"kubernetes.io/projected/7e7a6d2e-da38-4efd-b572-0fb131ccae60-kube-api-access-wz9lt\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.259497 4852 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.260099 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.260144 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e7a6d2e-da38-4efd-b572-0fb131ccae60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.281400 4852 scope.go:117] "RemoveContainer" containerID="7208f430134cd22178b78f5ff83b54b2784c642f6277cb9dd9ec6b4375e67d46" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.296897 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bbcc6b494-qbdnt"] Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.337077 4852 scope.go:117] "RemoveContainer" containerID="d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.337345 4852 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-api-5bbcc6b494-qbdnt"] Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.349921 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-68cl7"] Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.357462 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-68cl7"] Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.379942 4852 scope.go:117] "RemoveContainer" containerID="26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.404494 4852 scope.go:117] "RemoveContainer" containerID="d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b" Dec 10 12:14:19 crc kubenswrapper[4852]: E1210 12:14:19.405032 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b\": container with ID starting with d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b not found: ID does not exist" containerID="d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.405137 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b"} err="failed to get container status \"d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b\": rpc error: code = NotFound desc = could not find container \"d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b\": container with ID starting with d03f5b862f9544c0fe40a00fcf58d877d8eeba5074ba59dce95680a6ffe5f46b not found: ID does not exist" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.405215 4852 scope.go:117] "RemoveContainer" containerID="26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d" Dec 10 12:14:19 crc kubenswrapper[4852]: E1210 12:14:19.408107 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d\": container with ID starting with 26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d not found: ID does not exist" containerID="26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.408282 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d"} err="failed to get container status \"26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d\": rpc error: code = NotFound desc = could not find container \"26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d\": container with ID starting with 26825d9878ba93005fe535752341b4c95f001f4f294109daf4dac936f3356f9d not found: ID does not exist" Dec 10 12:14:19 crc kubenswrapper[4852]: I1210 12:14:19.969095 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7dd8c6757f-lbdxp" Dec 10 12:14:20 crc kubenswrapper[4852]: I1210 12:14:20.183732 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" path="/var/lib/kubelet/pods/2bb0fcab-02ae-40d7-acbf-8976c76d312a/volumes" Dec 10 12:14:20 crc kubenswrapper[4852]: I1210 12:14:20.184881 4852 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" path="/var/lib/kubelet/pods/7e7a6d2e-da38-4efd-b572-0fb131ccae60/volumes" Dec 10 12:14:20 crc kubenswrapper[4852]: I1210 12:14:20.423977 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.236516 4852 generic.go:334] "Generic (PLEG): container finished" podID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerID="74e4ba32144c0f58927e596841258dad73b8af5eb4940c88db318cdcda6c9a03" exitCode=0 Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.237200 4852 generic.go:334] "Generic (PLEG): container finished" podID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerID="3715a632bbe77e0d803a2f52e91d6055d3988b3eef335a81e3cd67b1126dd9e3" exitCode=0 Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.237308 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7","Type":"ContainerDied","Data":"74e4ba32144c0f58927e596841258dad73b8af5eb4940c88db318cdcda6c9a03"} Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.237392 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7","Type":"ContainerDied","Data":"3715a632bbe77e0d803a2f52e91d6055d3988b3eef335a81e3cd67b1126dd9e3"} Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.726354 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-955f9866d-84pn5" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.746406 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.807739 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data-custom\") pod \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.807795 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-etc-machine-id\") pod \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.807818 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-combined-ca-bundle\") pod \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.807880 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data\") pod \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.807925 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-scripts\") pod \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.807978 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp7bv\" (UniqueName: \"kubernetes.io/projected/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-kube-api-access-gp7bv\") pod \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\" (UID: \"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7\") " Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.809341 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" (UID: "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.828503 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-kube-api-access-gp7bv" (OuterVolumeSpecName: "kube-api-access-gp7bv") pod "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" (UID: "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7"). InnerVolumeSpecName "kube-api-access-gp7bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.836710 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-scripts" (OuterVolumeSpecName: "scripts") pod "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" (UID: "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.846400 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" (UID: "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.886333 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" (UID: "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.912386 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.912428 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp7bv\" (UniqueName: \"kubernetes.io/projected/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-kube-api-access-gp7bv\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.912442 4852 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.912455 4852 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.912467 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:21 crc kubenswrapper[4852]: I1210 12:14:21.950378 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data" (OuterVolumeSpecName: "config-data") pod "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" (UID: "5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.013729 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.249070 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7","Type":"ContainerDied","Data":"7b90bb51d036bb51d801098eca1af70f233c7d07875371063e9c8ec8fcebd9da"} Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.249115 4852 scope.go:117] "RemoveContainer" containerID="74e4ba32144c0f58927e596841258dad73b8af5eb4940c88db318cdcda6c9a03" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.249216 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.295601 4852 scope.go:117] "RemoveContainer" containerID="3715a632bbe77e0d803a2f52e91d6055d3988b3eef335a81e3cd67b1126dd9e3" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.295666 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.302209 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.309680 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310099 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="cinder-scheduler" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310121 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="cinder-scheduler" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310138 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerName="init" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310146 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerName="init" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310166 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-httpd" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310173 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-httpd" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310187 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-api" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310192 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-api" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310201 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api-log" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310207 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" 
containerName="barbican-api-log" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310220 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="probe" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310228 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="probe" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310257 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310264 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" Dec 10 12:14:22 crc kubenswrapper[4852]: E1210 12:14:22.310285 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerName="dnsmasq-dns" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310290 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerName="dnsmasq-dns" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310481 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-api" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310501 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="probe" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310510 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" containerName="cinder-scheduler" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310530 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api-log" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310542 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1348b87e-f303-4f39-9cf9-aa55ca6b0fd4" containerName="neutron-httpd" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310552 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7a6d2e-da38-4efd-b572-0fb131ccae60" containerName="barbican-api" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.310567 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb0fcab-02ae-40d7-acbf-8976c76d312a" containerName="dnsmasq-dns" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.311417 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.317804 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-j6dd8" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.318419 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.318546 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.320527 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.322330 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.354847 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.358341 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.362747 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.421425 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12600c57-0ba3-4781-93cc-317e533e52d8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.421859 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12600c57-0ba3-4781-93cc-317e533e52d8-openstack-config\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.421887 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.421914 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cw5\" (UniqueName: \"kubernetes.io/projected/ff8bb370-489b-402e-a532-8dc299fa3aee-kube-api-access-s5cw5\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.421941 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8bb370-489b-402e-a532-8dc299fa3aee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.422017 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-scripts\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.422049 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trmg7\" (UniqueName: \"kubernetes.io/projected/12600c57-0ba3-4781-93cc-317e533e52d8-kube-api-access-trmg7\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.422083 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.422127 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-config-data\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.422270 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12600c57-0ba3-4781-93cc-317e533e52d8-openstack-config-secret\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523637 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523725 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-config-data\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523753 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12600c57-0ba3-4781-93cc-317e533e52d8-openstack-config-secret\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523794 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12600c57-0ba3-4781-93cc-317e533e52d8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523834 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12600c57-0ba3-4781-93cc-317e533e52d8-openstack-config\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523855 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523875 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cw5\" (UniqueName: \"kubernetes.io/projected/ff8bb370-489b-402e-a532-8dc299fa3aee-kube-api-access-s5cw5\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523892 
4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8bb370-489b-402e-a532-8dc299fa3aee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523962 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-scripts\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.523997 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trmg7\" (UniqueName: \"kubernetes.io/projected/12600c57-0ba3-4781-93cc-317e533e52d8-kube-api-access-trmg7\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.524292 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ff8bb370-489b-402e-a532-8dc299fa3aee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.525224 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12600c57-0ba3-4781-93cc-317e533e52d8-openstack-config\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.529958 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12600c57-0ba3-4781-93cc-317e533e52d8-openstack-config-secret\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.530618 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.530740 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.531224 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12600c57-0ba3-4781-93cc-317e533e52d8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.531395 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-config-data\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " 
pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.533569 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8bb370-489b-402e-a532-8dc299fa3aee-scripts\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.542030 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trmg7\" (UniqueName: \"kubernetes.io/projected/12600c57-0ba3-4781-93cc-317e533e52d8-kube-api-access-trmg7\") pod \"openstackclient\" (UID: \"12600c57-0ba3-4781-93cc-317e533e52d8\") " pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.542747 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cw5\" (UniqueName: \"kubernetes.io/projected/ff8bb370-489b-402e-a532-8dc299fa3aee-kube-api-access-s5cw5\") pod \"cinder-scheduler-0\" (UID: \"ff8bb370-489b-402e-a532-8dc299fa3aee\") " pod="openstack/cinder-scheduler-0" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.672654 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 10 12:14:22 crc kubenswrapper[4852]: I1210 12:14:22.680670 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 10 12:14:23 crc kubenswrapper[4852]: I1210 12:14:23.201172 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 10 12:14:23 crc kubenswrapper[4852]: W1210 12:14:23.203006 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12600c57_0ba3_4781_93cc_317e533e52d8.slice/crio-f426bffd4c0f53364441ab64938eb1b9b0f4709055fbd47f0f55d3fb77dbb5f2 WatchSource:0}: Error finding container f426bffd4c0f53364441ab64938eb1b9b0f4709055fbd47f0f55d3fb77dbb5f2: Status 404 returned error can't find the container with id f426bffd4c0f53364441ab64938eb1b9b0f4709055fbd47f0f55d3fb77dbb5f2 Dec 10 12:14:23 crc kubenswrapper[4852]: I1210 12:14:23.258784 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"12600c57-0ba3-4781-93cc-317e533e52d8","Type":"ContainerStarted","Data":"f426bffd4c0f53364441ab64938eb1b9b0f4709055fbd47f0f55d3fb77dbb5f2"} Dec 10 12:14:23 crc kubenswrapper[4852]: I1210 12:14:23.287926 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 10 12:14:23 crc kubenswrapper[4852]: W1210 12:14:23.288950 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8bb370_489b_402e_a532_8dc299fa3aee.slice/crio-5c7385bc9b787ae43836a0bc8af691aba3f27c50db469a1b7f953a77e2c34e73 WatchSource:0}: Error finding container 5c7385bc9b787ae43836a0bc8af691aba3f27c50db469a1b7f953a77e2c34e73: Status 404 returned error can't find the container with id 5c7385bc9b787ae43836a0bc8af691aba3f27c50db469a1b7f953a77e2c34e73 Dec 10 12:14:24 crc kubenswrapper[4852]: I1210 12:14:24.186100 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7" path="/var/lib/kubelet/pods/5a125ae3-4463-42fa-a09c-c1b2cf8f5cb7/volumes" Dec 10 12:14:24 crc kubenswrapper[4852]: I1210 12:14:24.273126 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"ff8bb370-489b-402e-a532-8dc299fa3aee","Type":"ContainerStarted","Data":"e25df53e0c4680076862303a42b46402a4ef460620e7251ecee7d93b732e2e75"} Dec 10 12:14:24 crc kubenswrapper[4852]: I1210 12:14:24.273171 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ff8bb370-489b-402e-a532-8dc299fa3aee","Type":"ContainerStarted","Data":"5c7385bc9b787ae43836a0bc8af691aba3f27c50db469a1b7f953a77e2c34e73"} Dec 10 12:14:25 crc kubenswrapper[4852]: I1210 12:14:25.285740 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ff8bb370-489b-402e-a532-8dc299fa3aee","Type":"ContainerStarted","Data":"b096d904a4a15080f6c928f7c6efb62c12b4dd0d18d967c01b3971003fcb674b"} Dec 10 12:14:25 crc kubenswrapper[4852]: I1210 12:14:25.306883 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.306864513 podStartE2EDuration="3.306864513s" podCreationTimestamp="2025-12-10 12:14:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:25.304430772 +0000 UTC m=+1351.389955996" watchObservedRunningTime="2025-12-10 12:14:25.306864513 +0000 UTC m=+1351.392389737" Dec 10 12:14:27 crc kubenswrapper[4852]: I1210 12:14:27.680904 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 10 12:14:27 crc kubenswrapper[4852]: I1210 12:14:27.983696 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:14:27 crc kubenswrapper[4852]: I1210 12:14:27.984194 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-log" containerID="cri-o://1272a9cac8b7bc11125b0e8689ad82ac95352cff64a81c37270dc33c31e0e7a4" gracePeriod=30 Dec 10 12:14:27 crc kubenswrapper[4852]: I1210 12:14:27.984335 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-httpd" containerID="cri-o://ee39b662ae77cbfd9718a2674a3f69996c0ca865953d7e740f76e47c7b9d6aed" gracePeriod=30 Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.317950 4852 generic.go:334] "Generic (PLEG): container finished" podID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerID="1272a9cac8b7bc11125b0e8689ad82ac95352cff64a81c37270dc33c31e0e7a4" exitCode=143 Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.317989 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65bcb5c-dfe8-4412-8fbf-7e717ab28750","Type":"ContainerDied","Data":"1272a9cac8b7bc11125b0e8689ad82ac95352cff64a81c37270dc33c31e0e7a4"} Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.922658 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-8698bf8cd7-bmf4z"] Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.924326 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.926560 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.926622 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.926839 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.938343 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8698bf8cd7-bmf4z"] Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.956845 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-internal-tls-certs\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957197 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxqtl\" (UniqueName: \"kubernetes.io/projected/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-kube-api-access-rxqtl\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957250 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-public-tls-certs\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957310 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-config-data\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957332 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-combined-ca-bundle\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957479 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-log-httpd\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957525 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-run-httpd\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " 
pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:28 crc kubenswrapper[4852]: I1210 12:14:28.957570 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-etc-swift\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058325 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-log-httpd\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058378 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-run-httpd\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058402 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-etc-swift\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058487 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-internal-tls-certs\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058522 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxqtl\" (UniqueName: \"kubernetes.io/projected/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-kube-api-access-rxqtl\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058546 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-public-tls-certs\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058576 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-config-data\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058596 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-combined-ca-bundle\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 
12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.058919 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-log-httpd\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.059499 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-run-httpd\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.064889 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-public-tls-certs\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.065950 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-combined-ca-bundle\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.074045 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-internal-tls-certs\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.074352 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-config-data\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.074507 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-etc-swift\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.081484 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxqtl\" (UniqueName: \"kubernetes.io/projected/a41546b5-9dd3-4400-97ba-4bf433dc2c2c-kube-api-access-rxqtl\") pod \"swift-proxy-8698bf8cd7-bmf4z\" (UID: \"a41546b5-9dd3-4400-97ba-4bf433dc2c2c\") " pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:29 crc kubenswrapper[4852]: I1210 12:14:29.251314 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.360225 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.360500 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-log" containerID="cri-o://0fc8af81b80689c6cd891634190a7237468e2d4b4c5a654a4d08da0af40a9c8b" gracePeriod=30 Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.360636 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-httpd" containerID="cri-o://193f80cb3a9f9f3ddc44ffdd85d5ec927cc733119a9e5aca24aa455929ade979" gracePeriod=30 Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.612848 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.613594 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-central-agent" containerID="cri-o://9c178ca548780cf488018059f1b3248d6fa119166e911d4b6133f6270d03c18b" gracePeriod=30 Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.613652 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="sg-core" containerID="cri-o://f530d362208c1b6a46901913496b83ba4b18795421bf22702a2a86d5fc142262" gracePeriod=30 Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.613785 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-notification-agent" containerID="cri-o://d844717579df10010211d242c626820e5e63a7d13e3d6820da20344affbcb357" gracePeriod=30 Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.613818 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="proxy-httpd" containerID="cri-o://7afb19b742112b9cb7a90b9478dd12fed8cc3a169d6d00fb3389c79388e8bb0e" gracePeriod=30 Dec 10 12:14:30 crc kubenswrapper[4852]: I1210 12:14:30.632957 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.343983 4852 generic.go:334] "Generic (PLEG): container finished" podID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerID="0fc8af81b80689c6cd891634190a7237468e2d4b4c5a654a4d08da0af40a9c8b" exitCode=143 Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.344053 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e410ffd3-2d2c-4665-95fd-e20c287c3151","Type":"ContainerDied","Data":"0fc8af81b80689c6cd891634190a7237468e2d4b4c5a654a4d08da0af40a9c8b"} Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.347559 4852 generic.go:334] "Generic (PLEG): container finished" podID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" 
containerID="7afb19b742112b9cb7a90b9478dd12fed8cc3a169d6d00fb3389c79388e8bb0e" exitCode=0 Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.347588 4852 generic.go:334] "Generic (PLEG): container finished" podID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerID="f530d362208c1b6a46901913496b83ba4b18795421bf22702a2a86d5fc142262" exitCode=2 Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.347596 4852 generic.go:334] "Generic (PLEG): container finished" podID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerID="9c178ca548780cf488018059f1b3248d6fa119166e911d4b6133f6270d03c18b" exitCode=0 Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.347635 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerDied","Data":"7afb19b742112b9cb7a90b9478dd12fed8cc3a169d6d00fb3389c79388e8bb0e"} Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.347661 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerDied","Data":"f530d362208c1b6a46901913496b83ba4b18795421bf22702a2a86d5fc142262"} Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.347671 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerDied","Data":"9c178ca548780cf488018059f1b3248d6fa119166e911d4b6133f6270d03c18b"} Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.349758 4852 generic.go:334] "Generic (PLEG): container finished" podID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerID="ee39b662ae77cbfd9718a2674a3f69996c0ca865953d7e740f76e47c7b9d6aed" exitCode=0 Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.349796 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65bcb5c-dfe8-4412-8fbf-7e717ab28750","Type":"ContainerDied","Data":"ee39b662ae77cbfd9718a2674a3f69996c0ca865953d7e740f76e47c7b9d6aed"} Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.725988 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-955f9866d-84pn5" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 10 12:14:31 crc kubenswrapper[4852]: I1210 12:14:31.726108 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:14:32 crc kubenswrapper[4852]: I1210 12:14:32.365533 4852 generic.go:334] "Generic (PLEG): container finished" podID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerID="d844717579df10010211d242c626820e5e63a7d13e3d6820da20344affbcb357" exitCode=0 Dec 10 12:14:32 crc kubenswrapper[4852]: I1210 12:14:32.365723 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerDied","Data":"d844717579df10010211d242c626820e5e63a7d13e3d6820da20344affbcb357"} Dec 10 12:14:32 crc kubenswrapper[4852]: I1210 12:14:32.902808 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 10 12:14:34 crc kubenswrapper[4852]: I1210 12:14:34.386258 4852 generic.go:334] "Generic (PLEG): container finished" podID="e410ffd3-2d2c-4665-95fd-e20c287c3151" 
containerID="193f80cb3a9f9f3ddc44ffdd85d5ec927cc733119a9e5aca24aa455929ade979" exitCode=0 Dec 10 12:14:34 crc kubenswrapper[4852]: I1210 12:14:34.386573 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e410ffd3-2d2c-4665-95fd-e20c287c3151","Type":"ContainerDied","Data":"193f80cb3a9f9f3ddc44ffdd85d5ec927cc733119a9e5aca24aa455929ade979"} Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.732410 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8698bf8cd7-bmf4z"] Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.818153 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.900951 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-sg-core-conf-yaml\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.901048 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqp4j\" (UniqueName: \"kubernetes.io/projected/0dead70d-b3b6-4bee-a6cf-5e10924a3877-kube-api-access-mqp4j\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.919494 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dead70d-b3b6-4bee-a6cf-5e10924a3877-kube-api-access-mqp4j" (OuterVolumeSpecName: "kube-api-access-mqp4j") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "kube-api-access-mqp4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.922324 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:14:35 crc kubenswrapper[4852]: I1210 12:14:35.946016 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.002580 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-run-httpd\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.002639 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-log-httpd\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.002675 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-scripts\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.002730 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-config-data\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.002967 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-combined-ca-bundle\") pod \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\" (UID: \"0dead70d-b3b6-4bee-a6cf-5e10924a3877\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.003415 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.003442 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqp4j\" (UniqueName: \"kubernetes.io/projected/0dead70d-b3b6-4bee-a6cf-5e10924a3877-kube-api-access-mqp4j\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.003460 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.003642 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.007670 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-scripts" (OuterVolumeSpecName: "scripts") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.089795 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.101593 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-config-data" (OuterVolumeSpecName: "config-data") pod "0dead70d-b3b6-4bee-a6cf-5e10924a3877" (UID: "0dead70d-b3b6-4bee-a6cf-5e10924a3877"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105350 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjpj5\" (UniqueName: \"kubernetes.io/projected/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-kube-api-access-bjpj5\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105401 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-public-tls-certs\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105463 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-combined-ca-bundle\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105513 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-httpd-run\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105543 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-scripts\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105565 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-config-data\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105591 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105610 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-logs\") pod \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\" (UID: \"b65bcb5c-dfe8-4412-8fbf-7e717ab28750\") " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105925 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105947 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105958 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0dead70d-b3b6-4bee-a6cf-5e10924a3877-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105970 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.105980 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dead70d-b3b6-4bee-a6cf-5e10924a3877-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.106509 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-logs" (OuterVolumeSpecName: "logs") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.107008 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.115149 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-kube-api-access-bjpj5" (OuterVolumeSpecName: "kube-api-access-bjpj5") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "kube-api-access-bjpj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.117349 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-scripts" (OuterVolumeSpecName: "scripts") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.117526 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.154223 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.202356 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.205112 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-config-data" (OuterVolumeSpecName: "config-data") pod "b65bcb5c-dfe8-4412-8fbf-7e717ab28750" (UID: "b65bcb5c-dfe8-4412-8fbf-7e717ab28750"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208194 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjpj5\" (UniqueName: \"kubernetes.io/projected/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-kube-api-access-bjpj5\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208353 4852 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208370 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208381 4852 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208391 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208402 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208424 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.208434 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65bcb5c-dfe8-4412-8fbf-7e717ab28750-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.233512 4852 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.309632 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.404294 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0dead70d-b3b6-4bee-a6cf-5e10924a3877","Type":"ContainerDied","Data":"47d1459971d9f200a984e9fccd6aa1e9054430ab6e1abf9f2478f3d4058667d6"} Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.404350 4852 scope.go:117] "RemoveContainer" containerID="7afb19b742112b9cb7a90b9478dd12fed8cc3a169d6d00fb3389c79388e8bb0e" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.404525 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.409289 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65bcb5c-dfe8-4412-8fbf-7e717ab28750","Type":"ContainerDied","Data":"659d3f2e69352b280e0d66b788d9908372f2352e2a753e4c3755267ed352a956"} Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.409324 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.411038 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" event={"ID":"a41546b5-9dd3-4400-97ba-4bf433dc2c2c","Type":"ContainerStarted","Data":"ea3623660157b4c30909d22a32fd9fe55f8ad598061298ff23835323e514a1b7"} Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.411088 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" event={"ID":"a41546b5-9dd3-4400-97ba-4bf433dc2c2c","Type":"ContainerStarted","Data":"e61e72efd8dcf053f41183fe3630c28f207662c4d8023490359799b13ffd0787"} Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.425796 4852 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dead70d_b3b6_4bee_a6cf_5e10924a3877.slice\": RecentStats: unable to find data in memory cache]" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.433331 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.467354 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492412 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.492824 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-central-agent" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492841 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-central-agent" Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.492864 4852 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-httpd" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492870 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-httpd" Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.492881 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="proxy-httpd" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492888 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="proxy-httpd" Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.492907 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="sg-core" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492913 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="sg-core" Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.492930 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-log" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492938 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-log" Dec 10 12:14:36 crc kubenswrapper[4852]: E1210 12:14:36.492960 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-notification-agent" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.492966 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-notification-agent" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.493156 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-central-agent" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.493167 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="proxy-httpd" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.493183 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-log" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.493194 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="ceilometer-notification-agent" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.493204 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" containerName="sg-core" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.493220 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" containerName="glance-httpd" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.494935 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.497795 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.497986 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.503345 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.514084 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.526359 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.540630 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.543022 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.546921 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.569495 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.597313 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.617970 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.618058 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.618104 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njjc7\" (UniqueName: \"kubernetes.io/projected/912732dc-e2fb-4acd-a463-69eb77ff5a6d-kube-api-access-njjc7\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.618126 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-config-data\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.618168 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-log-httpd\") pod \"ceilometer-0\" (UID: 
\"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.618209 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-run-httpd\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.618257 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-scripts\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720187 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720313 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720362 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720393 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcf5\" (UniqueName: \"kubernetes.io/projected/0edbc55b-f57a-46c0-9991-33d794c74319-kube-api-access-7rcf5\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720424 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720456 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-config-data\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720500 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njjc7\" (UniqueName: \"kubernetes.io/projected/912732dc-e2fb-4acd-a463-69eb77ff5a6d-kube-api-access-njjc7\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc 
kubenswrapper[4852]: I1210 12:14:36.720523 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-config-data\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720571 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-scripts\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720611 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-log-httpd\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720636 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0edbc55b-f57a-46c0-9991-33d794c74319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720657 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbc55b-f57a-46c0-9991-33d794c74319-logs\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720691 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720724 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-run-httpd\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.720763 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-scripts\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.730547 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-scripts\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.730857 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-log-httpd\") pod \"ceilometer-0\" (UID: 
\"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.731063 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-run-httpd\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.761326 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njjc7\" (UniqueName: \"kubernetes.io/projected/912732dc-e2fb-4acd-a463-69eb77ff5a6d-kube-api-access-njjc7\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.763476 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-config-data\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.766953 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.773045 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825550 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcf5\" (UniqueName: \"kubernetes.io/projected/0edbc55b-f57a-46c0-9991-33d794c74319-kube-api-access-7rcf5\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825625 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-config-data\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825717 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-scripts\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825746 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0edbc55b-f57a-46c0-9991-33d794c74319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825777 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0edbc55b-f57a-46c0-9991-33d794c74319-logs\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825814 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825880 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.825932 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.826201 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.826561 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0edbc55b-f57a-46c0-9991-33d794c74319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.835633 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.836032 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbc55b-f57a-46c0-9991-33d794c74319-logs\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.836663 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-config-data\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.837599 4852 scope.go:117] "RemoveContainer" containerID="f530d362208c1b6a46901913496b83ba4b18795421bf22702a2a86d5fc142262" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.846058 4852 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-scripts\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.849195 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.898704 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edbc55b-f57a-46c0-9991-33d794c74319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.901212 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kbz4r"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.902353 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.904652 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcf5\" (UniqueName: \"kubernetes.io/projected/0edbc55b-f57a-46c0-9991-33d794c74319-kube-api-access-7rcf5\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.911169 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kbz4r"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.982091 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"0edbc55b-f57a-46c0-9991-33d794c74319\") " pod="openstack/glance-default-external-api-0" Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.997556 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qm45f"] Dec 10 12:14:36 crc kubenswrapper[4852]: I1210 12:14:36.998935 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.015862 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qm45f"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.019064 4852 scope.go:117] "RemoveContainer" containerID="d844717579df10010211d242c626820e5e63a7d13e3d6820da20344affbcb357" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.028911 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4df457a9-8045-4c39-abe3-31afc98aaa26-operator-scripts\") pod \"nova-api-db-create-kbz4r\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.028963 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2t6\" (UniqueName: \"kubernetes.io/projected/4df457a9-8045-4c39-abe3-31afc98aaa26-kube-api-access-nz2t6\") pod \"nova-api-db-create-kbz4r\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.092625 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2f56-account-create-update-mhhlb"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.094011 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.107059 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.130840 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2f56-account-create-update-mhhlb"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.137994 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4df457a9-8045-4c39-abe3-31afc98aaa26-operator-scripts\") pod \"nova-api-db-create-kbz4r\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.140705 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4df457a9-8045-4c39-abe3-31afc98aaa26-operator-scripts\") pod \"nova-api-db-create-kbz4r\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.140855 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz2t6\" (UniqueName: \"kubernetes.io/projected/4df457a9-8045-4c39-abe3-31afc98aaa26-kube-api-access-nz2t6\") pod \"nova-api-db-create-kbz4r\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.140906 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-operator-scripts\") pod \"nova-cell0-db-create-qm45f\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 
12:14:37.141178 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnsb\" (UniqueName: \"kubernetes.io/projected/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-kube-api-access-rvnsb\") pod \"nova-cell0-db-create-qm45f\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.146998 4852 scope.go:117] "RemoveContainer" containerID="9c178ca548780cf488018059f1b3248d6fa119166e911d4b6133f6270d03c18b" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.181955 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz2t6\" (UniqueName: \"kubernetes.io/projected/4df457a9-8045-4c39-abe3-31afc98aaa26-kube-api-access-nz2t6\") pod \"nova-api-db-create-kbz4r\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.189379 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mr6q4"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.191242 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mr6q4" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.191688 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.220030 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mr6q4"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.229011 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.244386 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-operator-scripts\") pod \"nova-cell0-db-create-qm45f\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.244488 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnsb\" (UniqueName: \"kubernetes.io/projected/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-kube-api-access-rvnsb\") pod \"nova-cell0-db-create-qm45f\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.244544 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhs4\" (UniqueName: \"kubernetes.io/projected/9c3f835c-31cf-4330-8951-9a1e5414b839-kube-api-access-xhhs4\") pod \"nova-api-2f56-account-create-update-mhhlb\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.244572 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3f835c-31cf-4330-8951-9a1e5414b839-operator-scripts\") pod \"nova-api-2f56-account-create-update-mhhlb\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.245982 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-operator-scripts\") pod \"nova-cell0-db-create-qm45f\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.296659 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnsb\" (UniqueName: \"kubernetes.io/projected/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-kube-api-access-rvnsb\") pod \"nova-cell0-db-create-qm45f\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.319163 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d986-account-create-update-hhxgf"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.336247 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.338338 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.354684 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/836ad386-beab-46db-b9a0-8c31ce0791ec-operator-scripts\") pod \"nova-cell1-db-create-mr6q4\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " pod="openstack/nova-cell1-db-create-mr6q4" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.355368 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.359555 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhs4\" (UniqueName: \"kubernetes.io/projected/9c3f835c-31cf-4330-8951-9a1e5414b839-kube-api-access-xhhs4\") pod \"nova-api-2f56-account-create-update-mhhlb\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.359612 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3f835c-31cf-4330-8951-9a1e5414b839-operator-scripts\") pod \"nova-api-2f56-account-create-update-mhhlb\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.359673 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4rb\" (UniqueName: \"kubernetes.io/projected/836ad386-beab-46db-b9a0-8c31ce0791ec-kube-api-access-ds4rb\") pod \"nova-cell1-db-create-mr6q4\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " pod="openstack/nova-cell1-db-create-mr6q4" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.362286 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3f835c-31cf-4330-8951-9a1e5414b839-operator-scripts\") pod \"nova-api-2f56-account-create-update-mhhlb\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:37 crc kubenswrapper[4852]: 
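[The VerifyControllerAttachedVolume / MountVolume started / MountVolume.SetUp succeeded triplets above are the kubelet volume manager converging the node's actual state onto the desired state computed from the pod specs. A minimal sketch of that reconcile loop follows; the types and names (volume, reconcile) are invented for illustration and are not kubelet's real API (the real loop lives under pkg/kubelet/volumemanager).]

    package main

    import "fmt"

    // volume identifies one volume of one pod, as the UniqueName fields do above.
    type volume struct{ pod, name string }

    // reconcile mounts whatever the pod specs want that is not mounted yet, and
    // unmounts whatever is still mounted that no pod wants anymore, emitting the
    // same sequence of messages seen in the reconciler_common.go lines.
    func reconcile(desired, actual map[volume]bool) {
        for v := range desired {
            if !actual[v] {
                fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod %q\n", v.name, v.pod)
                fmt.Printf("MountVolume started for volume %q\n", v.name)
                actual[v] = true // stands in for the real SetUp/mount work
                fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
            }
        }
        for v := range actual {
            if !desired[v] {
                fmt.Printf("UnmountVolume started for volume %q\n", v.name)
                delete(actual, v)
                fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v.name)
            }
        }
    }

    func main() {
        actual := map[volume]bool{}
        reconcile(map[volume]bool{
            {"nova-api-db-create-kbz4r", "operator-scripts"}:      true,
            {"nova-api-db-create-kbz4r", "kube-api-access-nz2t6"}: true,
        }, actual)
        reconcile(map[volume]bool{}, actual) // pod deleted: everything unmounts
    }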
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.363512 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.372164 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d986-account-create-update-hhxgf"]
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.382060 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhs4\" (UniqueName: \"kubernetes.io/projected/9c3f835c-31cf-4330-8951-9a1e5414b839-kube-api-access-xhhs4\") pod \"nova-api-2f56-account-create-update-mhhlb\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " pod="openstack/nova-api-2f56-account-create-update-mhhlb"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.385424 4852 scope.go:117] "RemoveContainer" containerID="ee39b662ae77cbfd9718a2674a3f69996c0ca865953d7e740f76e47c7b9d6aed"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.427221 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f56-account-create-update-mhhlb"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.444010 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e410ffd3-2d2c-4665-95fd-e20c287c3151","Type":"ContainerDied","Data":"cf1df9f9de452532550e75135f2ccbf612cf630a1fabd93430ab62d4aa53cd78"}
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.444118 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466190 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jhrh\" (UniqueName: \"kubernetes.io/projected/e410ffd3-2d2c-4665-95fd-e20c287c3151-kube-api-access-4jhrh\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466253 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-logs\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466281 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-combined-ca-bundle\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466377 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-httpd-run\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466417 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-config-data\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466511 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-scripts\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466542 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-internal-tls-certs\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466627 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e410ffd3-2d2c-4665-95fd-e20c287c3151\" (UID: \"e410ffd3-2d2c-4665-95fd-e20c287c3151\") "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.466987 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3c26f6c-540e-46cf-abb4-48905651f901-operator-scripts\") pod \"nova-cell0-d986-account-create-update-hhxgf\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " pod="openstack/nova-cell0-d986-account-create-update-hhxgf"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.467050 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4rb\" (UniqueName: \"kubernetes.io/projected/836ad386-beab-46db-b9a0-8c31ce0791ec-kube-api-access-ds4rb\") pod \"nova-cell1-db-create-mr6q4\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " pod="openstack/nova-cell1-db-create-mr6q4"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.467129 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltx8\" (UniqueName: \"kubernetes.io/projected/b3c26f6c-540e-46cf-abb4-48905651f901-kube-api-access-nltx8\") pod \"nova-cell0-d986-account-create-update-hhxgf\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " pod="openstack/nova-cell0-d986-account-create-update-hhxgf"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.467179 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/836ad386-beab-46db-b9a0-8c31ce0791ec-operator-scripts\") pod \"nova-cell1-db-create-mr6q4\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " pod="openstack/nova-cell1-db-create-mr6q4"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.467488 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.467718 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-logs" (OuterVolumeSpecName: "logs") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.468211 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/836ad386-beab-46db-b9a0-8c31ce0791ec-operator-scripts\") pod \"nova-cell1-db-create-mr6q4\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " pod="openstack/nova-cell1-db-create-mr6q4"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.481790 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-scripts" (OuterVolumeSpecName: "scripts") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.491519 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e410ffd3-2d2c-4665-95fd-e20c287c3151-kube-api-access-4jhrh" (OuterVolumeSpecName: "kube-api-access-4jhrh") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "kube-api-access-4jhrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.491793 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.504671 4852 scope.go:117] "RemoveContainer" containerID="1272a9cac8b7bc11125b0e8689ad82ac95352cff64a81c37270dc33c31e0e7a4"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.508812 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2c6a-account-create-update-tnhvt"]
Dec 10 12:14:37 crc kubenswrapper[4852]: E1210 12:14:37.509510 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-log"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.509532 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-log"
Dec 10 12:14:37 crc kubenswrapper[4852]: E1210 12:14:37.509553 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-httpd"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.509561 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-httpd"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.509904 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-log"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.509927 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" containerName="glance-httpd"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.510627 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.512743 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.518170 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.519772 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4rb\" (UniqueName: \"kubernetes.io/projected/836ad386-beab-46db-b9a0-8c31ce0791ec-kube-api-access-ds4rb\") pod \"nova-cell1-db-create-mr6q4\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " pod="openstack/nova-cell1-db-create-mr6q4"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.532109 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2c6a-account-create-update-tnhvt"]
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.568970 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nltx8\" (UniqueName: \"kubernetes.io/projected/b3c26f6c-540e-46cf-abb4-48905651f901-kube-api-access-nltx8\") pod \"nova-cell0-d986-account-create-update-hhxgf\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " pod="openstack/nova-cell0-d986-account-create-update-hhxgf"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.569479 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3c26f6c-540e-46cf-abb4-48905651f901-operator-scripts\") pod \"nova-cell0-d986-account-create-update-hhxgf\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " pod="openstack/nova-cell0-d986-account-create-update-hhxgf"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.570740 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mr6q4"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.570876 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3c26f6c-540e-46cf-abb4-48905651f901-operator-scripts\") pod \"nova-cell0-d986-account-create-update-hhxgf\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " pod="openstack/nova-cell0-d986-account-create-update-hhxgf"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.572904 4852 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.572934 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.572965 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.572985 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jhrh\" (UniqueName: \"kubernetes.io/projected/e410ffd3-2d2c-4665-95fd-e20c287c3151-kube-api-access-4jhrh\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.573007 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.573020 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e410ffd3-2d2c-4665-95fd-e20c287c3151-logs\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.581843 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.582432 4852 scope.go:117] "RemoveContainer" containerID="193f80cb3a9f9f3ddc44ffdd85d5ec927cc733119a9e5aca24aa455929ade979"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.598816 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nltx8\" (UniqueName: \"kubernetes.io/projected/b3c26f6c-540e-46cf-abb4-48905651f901-kube-api-access-nltx8\") pod \"nova-cell0-d986-account-create-update-hhxgf\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " pod="openstack/nova-cell0-d986-account-create-update-hhxgf"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.621547 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-config-data" (OuterVolumeSpecName: "config-data") pod "e410ffd3-2d2c-4665-95fd-e20c287c3151" (UID: "e410ffd3-2d2c-4665-95fd-e20c287c3151"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.633195 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.652547 4852 scope.go:117] "RemoveContainer" containerID="0fc8af81b80689c6cd891634190a7237468e2d4b4c5a654a4d08da0af40a9c8b"
Dec 10 12:14:37 crc kubenswrapper[4852]: W1210 12:14:37.667058 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod912732dc_e2fb_4acd_a463_69eb77ff5a6d.slice/crio-571d10d4b293f47f87ffa463ecdd968925cfa0b67adbaa971cebf2efb8a123c6 WatchSource:0}: Error finding container 571d10d4b293f47f87ffa463ecdd968925cfa0b67adbaa971cebf2efb8a123c6: Status 404 returned error can't find the container with id 571d10d4b293f47f87ffa463ecdd968925cfa0b67adbaa971cebf2efb8a123c6
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.667620 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.681419 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkmhz\" (UniqueName: \"kubernetes.io/projected/008fada8-afb1-41ff-bd3a-06820f79cee6-kube-api-access-nkmhz\") pod \"nova-cell1-2c6a-account-create-update-tnhvt\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.681657 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008fada8-afb1-41ff-bd3a-06820f79cee6-operator-scripts\") pod \"nova-cell1-2c6a-account-create-update-tnhvt\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt"
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.681849 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.681864 4852 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e410ffd3-2d2c-4665-95fd-e20c287c3151-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.681873 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.783898 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkmhz\" (UniqueName: \"kubernetes.io/projected/008fada8-afb1-41ff-bd3a-06820f79cee6-kube-api-access-nkmhz\") pod \"nova-cell1-2c6a-account-create-update-tnhvt\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt"
\"nova-cell1-2c6a-account-create-update-tnhvt\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.784970 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008fada8-afb1-41ff-bd3a-06820f79cee6-operator-scripts\") pod \"nova-cell1-2c6a-account-create-update-tnhvt\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.817072 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkmhz\" (UniqueName: \"kubernetes.io/projected/008fada8-afb1-41ff-bd3a-06820f79cee6-kube-api-access-nkmhz\") pod \"nova-cell1-2c6a-account-create-update-tnhvt\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.824072 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.844734 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.852288 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.854434 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.857861 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.859142 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.859305 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.883273 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.894073 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.958431 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kbz4r"] Dec 10 12:14:37 crc kubenswrapper[4852]: W1210 12:14:37.985785 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4df457a9_8045_4c39_abe3_31afc98aaa26.slice/crio-2d32109b3e013e5ae4ae29ed9a09c37d662106d9bd5b02c3ea1415e4b6b33e98 WatchSource:0}: Error finding container 2d32109b3e013e5ae4ae29ed9a09c37d662106d9bd5b02c3ea1415e4b6b33e98: Status 404 returned error can't find the container with id 2d32109b3e013e5ae4ae29ed9a09c37d662106d9bd5b02c3ea1415e4b6b33e98 Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987493 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987554 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987587 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7719c76-46f2-456f-8e69-8becce7f3b9c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987633 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7719c76-46f2-456f-8e69-8becce7f3b9c-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987672 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987723 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtfdn\" (UniqueName: \"kubernetes.io/projected/f7719c76-46f2-456f-8e69-8becce7f3b9c-kube-api-access-jtfdn\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987751 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:37 crc kubenswrapper[4852]: I1210 12:14:37.987823 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.039859 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.090897 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtfdn\" (UniqueName: \"kubernetes.io/projected/f7719c76-46f2-456f-8e69-8becce7f3b9c-kube-api-access-jtfdn\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.090955 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.091040 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.091088 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.091116 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.091142 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7719c76-46f2-456f-8e69-8becce7f3b9c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.091187 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7719c76-46f2-456f-8e69-8becce7f3b9c-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.091227 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.095177 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.095383 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7719c76-46f2-456f-8e69-8becce7f3b9c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.098410 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.110098 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.111021 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7719c76-46f2-456f-8e69-8becce7f3b9c-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.120987 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.129197 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7719c76-46f2-456f-8e69-8becce7f3b9c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.154376 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qm45f"] Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.185150 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtfdn\" (UniqueName: \"kubernetes.io/projected/f7719c76-46f2-456f-8e69-8becce7f3b9c-kube-api-access-jtfdn\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: W1210 12:14:38.202942 4852 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41e83be9_e570_4fe9_8a2d_e5a6fa941c24.slice/crio-c030e2f9ae77335fb09ed2eae95684000f3316f85f8703e120a2c9f39789ab6f WatchSource:0}: Error finding container c030e2f9ae77335fb09ed2eae95684000f3316f85f8703e120a2c9f39789ab6f: Status 404 returned error can't find the container with id c030e2f9ae77335fb09ed2eae95684000f3316f85f8703e120a2c9f39789ab6f Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.224475 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7719c76-46f2-456f-8e69-8becce7f3b9c\") " pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.300696 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dead70d-b3b6-4bee-a6cf-5e10924a3877" path="/var/lib/kubelet/pods/0dead70d-b3b6-4bee-a6cf-5e10924a3877/volumes" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.302898 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65bcb5c-dfe8-4412-8fbf-7e717ab28750" path="/var/lib/kubelet/pods/b65bcb5c-dfe8-4412-8fbf-7e717ab28750/volumes" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.316411 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e410ffd3-2d2c-4665-95fd-e20c287c3151" path="/var/lib/kubelet/pods/e410ffd3-2d2c-4665-95fd-e20c287c3151/volumes" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.319531 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mr6q4"] Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.319570 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2f56-account-create-update-mhhlb"] Dec 10 12:14:38 crc kubenswrapper[4852]: W1210 12:14:38.328364 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c3f835c_31cf_4330_8951_9a1e5414b839.slice/crio-28a98c0cbf0e9b9d705bcbd66d53c3558c89db958787689a3de0895fc3bc3ff7 WatchSource:0}: Error finding container 28a98c0cbf0e9b9d705bcbd66d53c3558c89db958787689a3de0895fc3bc3ff7: Status 404 returned error can't find the container with id 28a98c0cbf0e9b9d705bcbd66d53c3558c89db958787689a3de0895fc3bc3ff7 Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.400427 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.476102 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.495449 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"12600c57-0ba3-4781-93cc-317e533e52d8","Type":"ContainerStarted","Data":"076db2224ce797e94ed859e5b2b8ec43fb5cf892b381567a023b8925c636754a"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.505989 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mr6q4" event={"ID":"836ad386-beab-46db-b9a0-8c31ce0791ec","Type":"ContainerStarted","Data":"d1895296a64c6227ccdcc4b33a3ab9e347801fc37167a198e99460fe2cf798e1"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507252 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-secret-key\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507347 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-logs\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507412 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-config-data\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507468 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-combined-ca-bundle\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507595 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-tls-certs\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507652 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-scripts\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.507681 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmggl\" (UniqueName: \"kubernetes.io/projected/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-kube-api-access-fmggl\") pod \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\" (UID: \"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952\") " Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.509081 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-logs" (OuterVolumeSpecName: "logs") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.516087 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.530945 4852 generic.go:334] "Generic (PLEG): container finished" podID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerID="cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f" exitCode=137 Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.531030 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-955f9866d-84pn5" event={"ID":"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952","Type":"ContainerDied","Data":"cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.531063 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-955f9866d-84pn5" event={"ID":"d2bdb4ea-d2a9-4974-9cc5-54b15afa1952","Type":"ContainerDied","Data":"78b45f3b4b325654e6a45842fe89a4f984dfc6f4e00ff0a217c80846b57097b6"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.531084 4852 scope.go:117] "RemoveContainer" containerID="7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.531209 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-955f9866d-84pn5" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.531375 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-kube-api-access-fmggl" (OuterVolumeSpecName: "kube-api-access-fmggl") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "kube-api-access-fmggl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.538928 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.698675976 podStartE2EDuration="16.538903682s" podCreationTimestamp="2025-12-10 12:14:22 +0000 UTC" firstStartedPulling="2025-12-10 12:14:23.20483685 +0000 UTC m=+1349.290362074" lastFinishedPulling="2025-12-10 12:14:37.045064546 +0000 UTC m=+1363.130589780" observedRunningTime="2025-12-10 12:14:38.515877436 +0000 UTC m=+1364.601402670" watchObservedRunningTime="2025-12-10 12:14:38.538903682 +0000 UTC m=+1364.624428926" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.542058 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qm45f" event={"ID":"41e83be9-e570-4fe9-8a2d-e5a6fa941c24","Type":"ContainerStarted","Data":"c030e2f9ae77335fb09ed2eae95684000f3316f85f8703e120a2c9f39789ab6f"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.543838 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerStarted","Data":"571d10d4b293f47f87ffa463ecdd968925cfa0b67adbaa971cebf2efb8a123c6"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.546122 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0edbc55b-f57a-46c0-9991-33d794c74319","Type":"ContainerStarted","Data":"17c2cb986de26615574ca200bfb7d2ba2e00e54cf059b39a30c7314a04bc36f1"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.550117 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" event={"ID":"a41546b5-9dd3-4400-97ba-4bf433dc2c2c","Type":"ContainerStarted","Data":"b42cabd12924f7d150d2f8cfea79f16cd4b0e35a71e627eafd1c85ec599e8e4b"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.551206 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.551278 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.556251 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f56-account-create-update-mhhlb" event={"ID":"9c3f835c-31cf-4330-8951-9a1e5414b839","Type":"ContainerStarted","Data":"28a98c0cbf0e9b9d705bcbd66d53c3558c89db958787689a3de0895fc3bc3ff7"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.586379 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2c6a-account-create-update-tnhvt"] Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.586898 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kbz4r" event={"ID":"4df457a9-8045-4c39-abe3-31afc98aaa26","Type":"ContainerStarted","Data":"2d32109b3e013e5ae4ae29ed9a09c37d662106d9bd5b02c3ea1415e4b6b33e98"} Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.617466 4852 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.617501 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.617511 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmggl\" (UniqueName: \"kubernetes.io/projected/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-kube-api-access-fmggl\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.633411 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" podStartSLOduration=10.633387462 podStartE2EDuration="10.633387462s" podCreationTimestamp="2025-12-10 12:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:38.577050955 +0000 UTC m=+1364.662576179" watchObservedRunningTime="2025-12-10 12:14:38.633387462 +0000 UTC m=+1364.718912696" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.637580 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.649685 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-config-data" (OuterVolumeSpecName: "config-data") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.661051 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d986-account-create-update-hhxgf"] Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.699664 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-scripts" (OuterVolumeSpecName: "scripts") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.722947 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.723979 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.723998 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.899252 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" (UID: "d2bdb4ea-d2a9-4974-9cc5-54b15afa1952"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.928599 4852 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:38 crc kubenswrapper[4852]: I1210 12:14:38.966137 4852 scope.go:117] "RemoveContainer" containerID="cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f" Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.085575 4852 scope.go:117] "RemoveContainer" containerID="7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893" Dec 10 12:14:39 crc kubenswrapper[4852]: E1210 12:14:39.090438 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893\": container with ID starting with 7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893 not found: ID does not exist" containerID="7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893" Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.090485 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893"} err="failed to get container status \"7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893\": rpc error: code = NotFound desc = could not find container \"7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893\": container with ID starting with 7c9af4f2f0571f6ee7f6cd695ee6dc6dea2aeb6847027ba946b443e53eb04893 not found: ID does not exist" Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.090519 4852 scope.go:117] "RemoveContainer" containerID="cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f" Dec 10 12:14:39 crc kubenswrapper[4852]: E1210 12:14:39.090812 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f\": container with ID starting with cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f not found: ID does not exist" 
containerID="cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f" Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.090840 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f"} err="failed to get container status \"cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f\": rpc error: code = NotFound desc = could not find container \"cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f\": container with ID starting with cd151e5ec1c3769674d30ae366abd2299b303d8d113b077b3a3d46a3e8b07d3f not found: ID does not exist" Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.092418 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.245298 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-955f9866d-84pn5"] Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.291620 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-955f9866d-84pn5"] Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.346593 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.599253 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" event={"ID":"008fada8-afb1-41ff-bd3a-06820f79cee6","Type":"ContainerStarted","Data":"97e034f382963c0f52d9c05f4b46392b24377f29d545539e770cd19eea926516"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.602640 4852 generic.go:334] "Generic (PLEG): container finished" podID="9c3f835c-31cf-4330-8951-9a1e5414b839" containerID="60d633394812bf8f700c533e5f47f001ffa1826ac3b9408dab6399e46836be4b" exitCode=0 Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.602718 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f56-account-create-update-mhhlb" event={"ID":"9c3f835c-31cf-4330-8951-9a1e5414b839","Type":"ContainerDied","Data":"60d633394812bf8f700c533e5f47f001ffa1826ac3b9408dab6399e46836be4b"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.605098 4852 generic.go:334] "Generic (PLEG): container finished" podID="836ad386-beab-46db-b9a0-8c31ce0791ec" containerID="6cad90b9bf867a69fb6135afc6bbfa70fa2487292751ced5187d378666904193" exitCode=0 Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.605143 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mr6q4" event={"ID":"836ad386-beab-46db-b9a0-8c31ce0791ec","Type":"ContainerDied","Data":"6cad90b9bf867a69fb6135afc6bbfa70fa2487292751ced5187d378666904193"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.643948 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7719c76-46f2-456f-8e69-8becce7f3b9c","Type":"ContainerStarted","Data":"a8670f7a30a9586602ab6fbdb65bdb2069276ab7e495f911267f2ad9531edebd"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.652263 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" event={"ID":"b3c26f6c-540e-46cf-abb4-48905651f901","Type":"ContainerStarted","Data":"c639b7bcb6c6328a79d4af9c4e7c4c03b7e6e7752c62316c642d05b4aa861c25"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.658102 4852 generic.go:334] "Generic (PLEG): container finished" 
podID="4df457a9-8045-4c39-abe3-31afc98aaa26" containerID="f3d341118041b7819d874f7b71b76002aad445ab9373bc106672f06ed4186b9f" exitCode=0 Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.658373 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kbz4r" event={"ID":"4df457a9-8045-4c39-abe3-31afc98aaa26","Type":"ContainerDied","Data":"f3d341118041b7819d874f7b71b76002aad445ab9373bc106672f06ed4186b9f"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.663267 4852 generic.go:334] "Generic (PLEG): container finished" podID="41e83be9-e570-4fe9-8a2d-e5a6fa941c24" containerID="965acc844e6a755369fb439ec0735e5b164260faf98a715206debc22f5f38d10" exitCode=0 Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.663331 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qm45f" event={"ID":"41e83be9-e570-4fe9-8a2d-e5a6fa941c24","Type":"ContainerDied","Data":"965acc844e6a755369fb439ec0735e5b164260faf98a715206debc22f5f38d10"} Dec 10 12:14:39 crc kubenswrapper[4852]: I1210 12:14:39.746890 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.190053 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" path="/var/lib/kubelet/pods/d2bdb4ea-d2a9-4974-9cc5-54b15afa1952/volumes" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.192455 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272277 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06ad202-e036-4755-8869-d336483b8791-logs\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272321 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data-custom\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272462 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272507 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhhrl\" (UniqueName: \"kubernetes.io/projected/d06ad202-e036-4755-8869-d336483b8791-kube-api-access-xhhrl\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272547 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-combined-ca-bundle\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272584 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-scripts\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272602 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06ad202-e036-4755-8869-d336483b8791-etc-machine-id\") pod \"d06ad202-e036-4755-8869-d336483b8791\" (UID: \"d06ad202-e036-4755-8869-d336483b8791\") " Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.272994 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06ad202-e036-4755-8869-d336483b8791-logs" (OuterVolumeSpecName: "logs") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.273136 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d06ad202-e036-4755-8869-d336483b8791-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.275469 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06ad202-e036-4755-8869-d336483b8791-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.294811 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-scripts" (OuterVolumeSpecName: "scripts") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.297474 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06ad202-e036-4755-8869-d336483b8791-kube-api-access-xhhrl" (OuterVolumeSpecName: "kube-api-access-xhhrl") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "kube-api-access-xhhrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.313047 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.378748 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhhrl\" (UniqueName: \"kubernetes.io/projected/d06ad202-e036-4755-8869-d336483b8791-kube-api-access-xhhrl\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.379279 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.379385 4852 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06ad202-e036-4755-8869-d336483b8791-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.379505 4852 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.391123 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.405894 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data" (OuterVolumeSpecName: "config-data") pod "d06ad202-e036-4755-8869-d336483b8791" (UID: "d06ad202-e036-4755-8869-d336483b8791"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.482480 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.482522 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06ad202-e036-4755-8869-d336483b8791-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.688482 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7719c76-46f2-456f-8e69-8becce7f3b9c","Type":"ContainerStarted","Data":"43978f6b2a2be392d0747f7c69ec36faace89a79e22cf0e22bb700bdfe5f493b"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.693487 4852 generic.go:334] "Generic (PLEG): container finished" podID="b3c26f6c-540e-46cf-abb4-48905651f901" containerID="4a06eb7c8ba9dc9c3071d49362dad47c119e26697c724fed172a559e0edcf793" exitCode=0 Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.693555 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" event={"ID":"b3c26f6c-540e-46cf-abb4-48905651f901","Type":"ContainerDied","Data":"4a06eb7c8ba9dc9c3071d49362dad47c119e26697c724fed172a559e0edcf793"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.705593 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerStarted","Data":"ac482ac18dd26eb9e6288d4b8befae5905404c133965037a1d5127ec8ac07a2a"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.713442 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0edbc55b-f57a-46c0-9991-33d794c74319","Type":"ContainerStarted","Data":"5022703b7dad8e274bfd06eac82a3c8033e0c12cc13bb54bd25defad650aef57"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.716744 4852 generic.go:334] "Generic (PLEG): container finished" podID="008fada8-afb1-41ff-bd3a-06820f79cee6" containerID="bca81ee791945c82d000819a72940012d8a3904b58c7709bd3fdcc688ad548ae" exitCode=0 Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.716793 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" event={"ID":"008fada8-afb1-41ff-bd3a-06820f79cee6","Type":"ContainerDied","Data":"bca81ee791945c82d000819a72940012d8a3904b58c7709bd3fdcc688ad548ae"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.726765 4852 generic.go:334] "Generic (PLEG): container finished" podID="d06ad202-e036-4755-8869-d336483b8791" containerID="5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60" exitCode=137 Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.727756 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.729132 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d06ad202-e036-4755-8869-d336483b8791","Type":"ContainerDied","Data":"5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.729208 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d06ad202-e036-4755-8869-d336483b8791","Type":"ContainerDied","Data":"e0276610495377dc07a73e5dc5135f0d9b2f6481ee82d830db1a7333e998d6a3"} Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.729522 4852 scope.go:117] "RemoveContainer" containerID="5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.797066 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.859049 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.870319 4852 scope.go:117] "RemoveContainer" containerID="26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.876443 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:40 crc kubenswrapper[4852]: E1210 12:14:40.876887 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.876899 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api" Dec 10 12:14:40 crc kubenswrapper[4852]: E1210 12:14:40.876913 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon-log" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.876919 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon-log" Dec 10 12:14:40 crc kubenswrapper[4852]: E1210 12:14:40.876934 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.876940 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" Dec 10 12:14:40 crc kubenswrapper[4852]: E1210 12:14:40.876956 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api-log" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.876963 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api-log" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.877197 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.877220 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.877242 4852 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d06ad202-e036-4755-8869-d336483b8791" containerName="cinder-api-log" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.877256 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2bdb4ea-d2a9-4974-9cc5-54b15afa1952" containerName="horizon-log" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.878476 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.885285 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.886893 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.886957 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.887193 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.953343 4852 scope.go:117] "RemoveContainer" containerID="5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60" Dec 10 12:14:40 crc kubenswrapper[4852]: E1210 12:14:40.954351 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60\": container with ID starting with 5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60 not found: ID does not exist" containerID="5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.954381 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60"} err="failed to get container status \"5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60\": rpc error: code = NotFound desc = could not find container \"5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60\": container with ID starting with 5431aa45b51eb26c887ca9fc0edd30d266357e70e3473caff67bc272413f2e60 not found: ID does not exist" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.954407 4852 scope.go:117] "RemoveContainer" containerID="26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2" Dec 10 12:14:40 crc kubenswrapper[4852]: E1210 12:14:40.954699 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2\": container with ID starting with 26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2 not found: ID does not exist" containerID="26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.954725 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2"} err="failed to get container status \"26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2\": rpc error: code = NotFound desc = could not find container \"26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2\": container with ID starting with 26fab6edf2f06c869dc9a7a548be40489875012d2168da139d9ab4e84fa6b2a2 not found: ID 
does not exist" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994131 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ee70b55-95d5-4ea5-9626-a6482097668c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994198 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994310 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee70b55-95d5-4ea5-9626-a6482097668c-logs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994371 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-config-data-custom\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994395 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-scripts\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994462 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-config-data\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994487 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwfgk\" (UniqueName: \"kubernetes.io/projected/3ee70b55-95d5-4ea5-9626-a6482097668c-kube-api-access-dwfgk\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:40 crc kubenswrapper[4852]: I1210 12:14:40.994515 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.096098 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-config-data-custom\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097170 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-scripts\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097393 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-config-data\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097419 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwfgk\" (UniqueName: \"kubernetes.io/projected/3ee70b55-95d5-4ea5-9626-a6482097668c-kube-api-access-dwfgk\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097444 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097482 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ee70b55-95d5-4ea5-9626-a6482097668c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097501 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097569 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097588 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee70b55-95d5-4ea5-9626-a6482097668c-logs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.097978 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee70b55-95d5-4ea5-9626-a6482097668c-logs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.103352 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-config-data-custom\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.107693 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.111585 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-config-data\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.117990 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.118072 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ee70b55-95d5-4ea5-9626-a6482097668c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.118717 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwfgk\" (UniqueName: \"kubernetes.io/projected/3ee70b55-95d5-4ea5-9626-a6482097668c-kube-api-access-dwfgk\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.209410 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-scripts\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.209876 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee70b55-95d5-4ea5-9626-a6482097668c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3ee70b55-95d5-4ea5-9626-a6482097668c\") " pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.489978 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mr6q4" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.501694 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.521192 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds4rb\" (UniqueName: \"kubernetes.io/projected/836ad386-beab-46db-b9a0-8c31ce0791ec-kube-api-access-ds4rb\") pod \"836ad386-beab-46db-b9a0-8c31ce0791ec\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.521310 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/836ad386-beab-46db-b9a0-8c31ce0791ec-operator-scripts\") pod \"836ad386-beab-46db-b9a0-8c31ce0791ec\" (UID: \"836ad386-beab-46db-b9a0-8c31ce0791ec\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.522514 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836ad386-beab-46db-b9a0-8c31ce0791ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "836ad386-beab-46db-b9a0-8c31ce0791ec" (UID: "836ad386-beab-46db-b9a0-8c31ce0791ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.537660 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836ad386-beab-46db-b9a0-8c31ce0791ec-kube-api-access-ds4rb" (OuterVolumeSpecName: "kube-api-access-ds4rb") pod "836ad386-beab-46db-b9a0-8c31ce0791ec" (UID: "836ad386-beab-46db-b9a0-8c31ce0791ec"). InnerVolumeSpecName "kube-api-access-ds4rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.625624 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds4rb\" (UniqueName: \"kubernetes.io/projected/836ad386-beab-46db-b9a0-8c31ce0791ec-kube-api-access-ds4rb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.625661 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/836ad386-beab-46db-b9a0-8c31ce0791ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.638515 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.643597 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.648374 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.726759 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnsb\" (UniqueName: \"kubernetes.io/projected/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-kube-api-access-rvnsb\") pod \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.727081 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4df457a9-8045-4c39-abe3-31afc98aaa26-operator-scripts\") pod \"4df457a9-8045-4c39-abe3-31afc98aaa26\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.727131 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhhs4\" (UniqueName: \"kubernetes.io/projected/9c3f835c-31cf-4330-8951-9a1e5414b839-kube-api-access-xhhs4\") pod \"9c3f835c-31cf-4330-8951-9a1e5414b839\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.727181 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz2t6\" (UniqueName: \"kubernetes.io/projected/4df457a9-8045-4c39-abe3-31afc98aaa26-kube-api-access-nz2t6\") pod \"4df457a9-8045-4c39-abe3-31afc98aaa26\" (UID: \"4df457a9-8045-4c39-abe3-31afc98aaa26\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.727346 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-operator-scripts\") pod \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\" (UID: \"41e83be9-e570-4fe9-8a2d-e5a6fa941c24\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.727365 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3f835c-31cf-4330-8951-9a1e5414b839-operator-scripts\") pod \"9c3f835c-31cf-4330-8951-9a1e5414b839\" (UID: \"9c3f835c-31cf-4330-8951-9a1e5414b839\") " Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.728069 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4df457a9-8045-4c39-abe3-31afc98aaa26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4df457a9-8045-4c39-abe3-31afc98aaa26" (UID: "4df457a9-8045-4c39-abe3-31afc98aaa26"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.728089 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41e83be9-e570-4fe9-8a2d-e5a6fa941c24" (UID: "41e83be9-e570-4fe9-8a2d-e5a6fa941c24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.728105 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3f835c-31cf-4330-8951-9a1e5414b839-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c3f835c-31cf-4330-8951-9a1e5414b839" (UID: "9c3f835c-31cf-4330-8951-9a1e5414b839"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.732917 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3f835c-31cf-4330-8951-9a1e5414b839-kube-api-access-xhhs4" (OuterVolumeSpecName: "kube-api-access-xhhs4") pod "9c3f835c-31cf-4330-8951-9a1e5414b839" (UID: "9c3f835c-31cf-4330-8951-9a1e5414b839"). InnerVolumeSpecName "kube-api-access-xhhs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.735797 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-kube-api-access-rvnsb" (OuterVolumeSpecName: "kube-api-access-rvnsb") pod "41e83be9-e570-4fe9-8a2d-e5a6fa941c24" (UID: "41e83be9-e570-4fe9-8a2d-e5a6fa941c24"). InnerVolumeSpecName "kube-api-access-rvnsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.739835 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df457a9-8045-4c39-abe3-31afc98aaa26-kube-api-access-nz2t6" (OuterVolumeSpecName: "kube-api-access-nz2t6") pod "4df457a9-8045-4c39-abe3-31afc98aaa26" (UID: "4df457a9-8045-4c39-abe3-31afc98aaa26"). InnerVolumeSpecName "kube-api-access-nz2t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.741860 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kbz4r" event={"ID":"4df457a9-8045-4c39-abe3-31afc98aaa26","Type":"ContainerDied","Data":"2d32109b3e013e5ae4ae29ed9a09c37d662106d9bd5b02c3ea1415e4b6b33e98"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.741890 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d32109b3e013e5ae4ae29ed9a09c37d662106d9bd5b02c3ea1415e4b6b33e98" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.741943 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kbz4r" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.753390 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qm45f" event={"ID":"41e83be9-e570-4fe9-8a2d-e5a6fa941c24","Type":"ContainerDied","Data":"c030e2f9ae77335fb09ed2eae95684000f3316f85f8703e120a2c9f39789ab6f"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.753436 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c030e2f9ae77335fb09ed2eae95684000f3316f85f8703e120a2c9f39789ab6f" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.753505 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qm45f" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.759297 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerStarted","Data":"2beaba334d4f987b6c6a3c8a9e0b734c50bf158c6c75d055a81a1f1dc2cde901"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.766785 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0edbc55b-f57a-46c0-9991-33d794c74319","Type":"ContainerStarted","Data":"18c919401ecb225f40c56e2ea11b11d9e597e8f2e03e6e5d12aaf5f22c7d55cf"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.779752 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f56-account-create-update-mhhlb" event={"ID":"9c3f835c-31cf-4330-8951-9a1e5414b839","Type":"ContainerDied","Data":"28a98c0cbf0e9b9d705bcbd66d53c3558c89db958787689a3de0895fc3bc3ff7"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.779802 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28a98c0cbf0e9b9d705bcbd66d53c3558c89db958787689a3de0895fc3bc3ff7" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.779829 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f56-account-create-update-mhhlb" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.793550 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mr6q4" event={"ID":"836ad386-beab-46db-b9a0-8c31ce0791ec","Type":"ContainerDied","Data":"d1895296a64c6227ccdcc4b33a3ab9e347801fc37167a198e99460fe2cf798e1"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.793597 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1895296a64c6227ccdcc4b33a3ab9e347801fc37167a198e99460fe2cf798e1" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.793667 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mr6q4" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.805382 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7719c76-46f2-456f-8e69-8becce7f3b9c","Type":"ContainerStarted","Data":"3c99e36d13055352b0d2bc9f734986d6b6d7fb4b1dea608e458605f90a7a12f4"} Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.829874 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhhs4\" (UniqueName: \"kubernetes.io/projected/9c3f835c-31cf-4330-8951-9a1e5414b839-kube-api-access-xhhs4\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.829904 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz2t6\" (UniqueName: \"kubernetes.io/projected/4df457a9-8045-4c39-abe3-31afc98aaa26-kube-api-access-nz2t6\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.829914 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.829922 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3f835c-31cf-4330-8951-9a1e5414b839-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.829932 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnsb\" (UniqueName: \"kubernetes.io/projected/41e83be9-e570-4fe9-8a2d-e5a6fa941c24-kube-api-access-rvnsb\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.829941 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4df457a9-8045-4c39-abe3-31afc98aaa26-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.847461 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.847436993 podStartE2EDuration="5.847436993s" podCreationTimestamp="2025-12-10 12:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:41.791275879 +0000 UTC m=+1367.876801103" watchObservedRunningTime="2025-12-10 12:14:41.847436993 +0000 UTC m=+1367.932962227" Dec 10 12:14:41 crc kubenswrapper[4852]: I1210 12:14:41.849367 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.849353911 podStartE2EDuration="4.849353911s" podCreationTimestamp="2025-12-10 12:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:41.837986777 +0000 UTC m=+1367.923512001" watchObservedRunningTime="2025-12-10 12:14:41.849353911 +0000 UTC m=+1367.934879135" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.149665 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 10 12:14:42 crc kubenswrapper[4852]: W1210 12:14:42.179470 4852 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ee70b55_95d5_4ea5_9626_a6482097668c.slice/crio-fdb8831cddccb790df308e457b4c01eba50c3750d5f7d9bd45de8656c9765e7a WatchSource:0}: Error finding container fdb8831cddccb790df308e457b4c01eba50c3750d5f7d9bd45de8656c9765e7a: Status 404 returned error can't find the container with id fdb8831cddccb790df308e457b4c01eba50c3750d5f7d9bd45de8656c9765e7a Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.200348 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06ad202-e036-4755-8869-d336483b8791" path="/var/lib/kubelet/pods/d06ad202-e036-4755-8869-d336483b8791/volumes" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.382754 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.392574 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.569717 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkmhz\" (UniqueName: \"kubernetes.io/projected/008fada8-afb1-41ff-bd3a-06820f79cee6-kube-api-access-nkmhz\") pod \"008fada8-afb1-41ff-bd3a-06820f79cee6\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.569917 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nltx8\" (UniqueName: \"kubernetes.io/projected/b3c26f6c-540e-46cf-abb4-48905651f901-kube-api-access-nltx8\") pod \"b3c26f6c-540e-46cf-abb4-48905651f901\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.569992 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3c26f6c-540e-46cf-abb4-48905651f901-operator-scripts\") pod \"b3c26f6c-540e-46cf-abb4-48905651f901\" (UID: \"b3c26f6c-540e-46cf-abb4-48905651f901\") " Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.570129 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008fada8-afb1-41ff-bd3a-06820f79cee6-operator-scripts\") pod \"008fada8-afb1-41ff-bd3a-06820f79cee6\" (UID: \"008fada8-afb1-41ff-bd3a-06820f79cee6\") " Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.570606 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/008fada8-afb1-41ff-bd3a-06820f79cee6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "008fada8-afb1-41ff-bd3a-06820f79cee6" (UID: "008fada8-afb1-41ff-bd3a-06820f79cee6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.570601 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3c26f6c-540e-46cf-abb4-48905651f901-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b3c26f6c-540e-46cf-abb4-48905651f901" (UID: "b3c26f6c-540e-46cf-abb4-48905651f901"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.571040 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3c26f6c-540e-46cf-abb4-48905651f901-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.571075 4852 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008fada8-afb1-41ff-bd3a-06820f79cee6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.578576 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008fada8-afb1-41ff-bd3a-06820f79cee6-kube-api-access-nkmhz" (OuterVolumeSpecName: "kube-api-access-nkmhz") pod "008fada8-afb1-41ff-bd3a-06820f79cee6" (UID: "008fada8-afb1-41ff-bd3a-06820f79cee6"). InnerVolumeSpecName "kube-api-access-nkmhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.578608 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c26f6c-540e-46cf-abb4-48905651f901-kube-api-access-nltx8" (OuterVolumeSpecName: "kube-api-access-nltx8") pod "b3c26f6c-540e-46cf-abb4-48905651f901" (UID: "b3c26f6c-540e-46cf-abb4-48905651f901"). InnerVolumeSpecName "kube-api-access-nltx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.673127 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkmhz\" (UniqueName: \"kubernetes.io/projected/008fada8-afb1-41ff-bd3a-06820f79cee6-kube-api-access-nkmhz\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.673169 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nltx8\" (UniqueName: \"kubernetes.io/projected/b3c26f6c-540e-46cf-abb4-48905651f901-kube-api-access-nltx8\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.817284 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3ee70b55-95d5-4ea5-9626-a6482097668c","Type":"ContainerStarted","Data":"fdb8831cddccb790df308e457b4c01eba50c3750d5f7d9bd45de8656c9765e7a"} Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.819790 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" event={"ID":"b3c26f6c-540e-46cf-abb4-48905651f901","Type":"ContainerDied","Data":"c639b7bcb6c6328a79d4af9c4e7c4c03b7e6e7752c62316c642d05b4aa861c25"} Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.819834 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c639b7bcb6c6328a79d4af9c4e7c4c03b7e6e7752c62316c642d05b4aa861c25" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.819831 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d986-account-create-update-hhxgf" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.824853 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerStarted","Data":"6f69fc189608d721cb666cde8c3dfd0f6d22aad68377f4e17538400dbb12fb37"} Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.827149 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.827199 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2c6a-account-create-update-tnhvt" event={"ID":"008fada8-afb1-41ff-bd3a-06820f79cee6","Type":"ContainerDied","Data":"97e034f382963c0f52d9c05f4b46392b24377f29d545539e770cd19eea926516"} Dec 10 12:14:42 crc kubenswrapper[4852]: I1210 12:14:42.827224 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e034f382963c0f52d9c05f4b46392b24377f29d545539e770cd19eea926516" Dec 10 12:14:43 crc kubenswrapper[4852]: I1210 12:14:43.835793 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3ee70b55-95d5-4ea5-9626-a6482097668c","Type":"ContainerStarted","Data":"0aa1aff5ed180b51d6b8dd5c0af7a914ee88666e1fc79a9b2723f9145afd9ede"} Dec 10 12:14:43 crc kubenswrapper[4852]: I1210 12:14:43.836032 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3ee70b55-95d5-4ea5-9626-a6482097668c","Type":"ContainerStarted","Data":"381b52d098868ebb0b3aef6c9c04bddc8f0009ece994d8054d9542dce02c9b5c"} Dec 10 12:14:43 crc kubenswrapper[4852]: I1210 12:14:43.836085 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 10 12:14:43 crc kubenswrapper[4852]: I1210 12:14:43.883722 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.883693412 podStartE2EDuration="3.883693412s" podCreationTimestamp="2025-12-10 12:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:14:43.853328043 +0000 UTC m=+1369.938853267" watchObservedRunningTime="2025-12-10 12:14:43.883693412 +0000 UTC m=+1369.969218636" Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.263647 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.264150 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8698bf8cd7-bmf4z" Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.845713 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerStarted","Data":"d08fa1671f2f5c0b2e0350aa3c1dbfe61eda3a4a85e3d08b710f591f995a9445"} Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.845939 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-central-agent" containerID="cri-o://ac482ac18dd26eb9e6288d4b8befae5905404c133965037a1d5127ec8ac07a2a" gracePeriod=30 Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.845989 4852 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.846060 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="proxy-httpd" containerID="cri-o://d08fa1671f2f5c0b2e0350aa3c1dbfe61eda3a4a85e3d08b710f591f995a9445" gracePeriod=30 Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.846114 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="sg-core" containerID="cri-o://6f69fc189608d721cb666cde8c3dfd0f6d22aad68377f4e17538400dbb12fb37" gracePeriod=30 Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.846157 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-notification-agent" containerID="cri-o://2beaba334d4f987b6c6a3c8a9e0b734c50bf158c6c75d055a81a1f1dc2cde901" gracePeriod=30 Dec 10 12:14:44 crc kubenswrapper[4852]: I1210 12:14:44.871521 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.876245702 podStartE2EDuration="8.871504455s" podCreationTimestamp="2025-12-10 12:14:36 +0000 UTC" firstStartedPulling="2025-12-10 12:14:37.674504504 +0000 UTC m=+1363.760029728" lastFinishedPulling="2025-12-10 12:14:43.669763257 +0000 UTC m=+1369.755288481" observedRunningTime="2025-12-10 12:14:44.866261774 +0000 UTC m=+1370.951786988" watchObservedRunningTime="2025-12-10 12:14:44.871504455 +0000 UTC m=+1370.957029679" Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858488 4852 generic.go:334] "Generic (PLEG): container finished" podID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerID="d08fa1671f2f5c0b2e0350aa3c1dbfe61eda3a4a85e3d08b710f591f995a9445" exitCode=0 Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858900 4852 generic.go:334] "Generic (PLEG): container finished" podID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerID="6f69fc189608d721cb666cde8c3dfd0f6d22aad68377f4e17538400dbb12fb37" exitCode=2 Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858913 4852 generic.go:334] "Generic (PLEG): container finished" podID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerID="2beaba334d4f987b6c6a3c8a9e0b734c50bf158c6c75d055a81a1f1dc2cde901" exitCode=0 Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858922 4852 generic.go:334] "Generic (PLEG): container finished" podID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerID="ac482ac18dd26eb9e6288d4b8befae5905404c133965037a1d5127ec8ac07a2a" exitCode=0 Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858528 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerDied","Data":"d08fa1671f2f5c0b2e0350aa3c1dbfe61eda3a4a85e3d08b710f591f995a9445"} Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858952 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerDied","Data":"6f69fc189608d721cb666cde8c3dfd0f6d22aad68377f4e17538400dbb12fb37"} Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858968 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerDied","Data":"2beaba334d4f987b6c6a3c8a9e0b734c50bf158c6c75d055a81a1f1dc2cde901"} Dec 10 12:14:45 crc kubenswrapper[4852]: I1210 12:14:45.858977 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerDied","Data":"ac482ac18dd26eb9e6288d4b8befae5905404c133965037a1d5127ec8ac07a2a"} Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.146209 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.192039 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.192086 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.240886 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.245173 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.259814 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njjc7\" (UniqueName: \"kubernetes.io/projected/912732dc-e2fb-4acd-a463-69eb77ff5a6d-kube-api-access-njjc7\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.260976 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-scripts\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261050 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-run-httpd\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261080 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-log-httpd\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261144 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-combined-ca-bundle\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261188 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-config-data\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261333 4852 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-sg-core-conf-yaml\") pod \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\" (UID: \"912732dc-e2fb-4acd-a463-69eb77ff5a6d\") " Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261622 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.261924 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.262361 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.262386 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/912732dc-e2fb-4acd-a463-69eb77ff5a6d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.268703 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-scripts" (OuterVolumeSpecName: "scripts") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.276707 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/912732dc-e2fb-4acd-a463-69eb77ff5a6d-kube-api-access-njjc7" (OuterVolumeSpecName: "kube-api-access-njjc7") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "kube-api-access-njjc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.301792 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.364187 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.364366 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njjc7\" (UniqueName: \"kubernetes.io/projected/912732dc-e2fb-4acd-a463-69eb77ff5a6d-kube-api-access-njjc7\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.364440 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.372472 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.412440 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-config-data" (OuterVolumeSpecName: "config-data") pod "912732dc-e2fb-4acd-a463-69eb77ff5a6d" (UID: "912732dc-e2fb-4acd-a463-69eb77ff5a6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.466437 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.466466 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/912732dc-e2fb-4acd-a463-69eb77ff5a6d-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.634897 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7h6"] Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635356 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df457a9-8045-4c39-abe3-31afc98aaa26" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635379 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df457a9-8045-4c39-abe3-31afc98aaa26" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635395 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-central-agent" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635403 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-central-agent" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635421 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836ad386-beab-46db-b9a0-8c31ce0791ec" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635428 4852 
state_mem.go:107] "Deleted CPUSet assignment" podUID="836ad386-beab-46db-b9a0-8c31ce0791ec" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635438 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e83be9-e570-4fe9-8a2d-e5a6fa941c24" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635444 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e83be9-e570-4fe9-8a2d-e5a6fa941c24" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635474 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="proxy-httpd" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635482 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="proxy-httpd" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635495 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c26f6c-540e-46cf-abb4-48905651f901" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635502 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c26f6c-540e-46cf-abb4-48905651f901" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635517 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008fada8-afb1-41ff-bd3a-06820f79cee6" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635524 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="008fada8-afb1-41ff-bd3a-06820f79cee6" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635536 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="sg-core" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635543 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="sg-core" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635554 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-notification-agent" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635561 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-notification-agent" Dec 10 12:14:47 crc kubenswrapper[4852]: E1210 12:14:47.635576 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3f835c-31cf-4330-8951-9a1e5414b839" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635585 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3f835c-31cf-4330-8951-9a1e5414b839" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635773 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-central-agent" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635794 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="proxy-httpd" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635806 4852 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="ceilometer-notification-agent" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635818 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" containerName="sg-core" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635839 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e83be9-e570-4fe9-8a2d-e5a6fa941c24" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635848 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="008fada8-afb1-41ff-bd3a-06820f79cee6" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635858 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="836ad386-beab-46db-b9a0-8c31ce0791ec" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635870 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3c26f6c-540e-46cf-abb4-48905651f901" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635879 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3f835c-31cf-4330-8951-9a1e5414b839" containerName="mariadb-account-create-update" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.635890 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df457a9-8045-4c39-abe3-31afc98aaa26" containerName="mariadb-database-create" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.636939 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.639336 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.639358 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.639616 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-45f4l" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.647546 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7h6"] Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.771370 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmh26\" (UniqueName: \"kubernetes.io/projected/f59c1e79-8117-43c8-bd38-0f1f144271a5-kube-api-access-mmh26\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.771528 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-config-data\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.771722 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-scripts\") 
pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.771793 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.873412 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-config-data\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.873501 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-scripts\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.873540 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.873571 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmh26\" (UniqueName: \"kubernetes.io/projected/f59c1e79-8117-43c8-bd38-0f1f144271a5-kube-api-access-mmh26\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.877461 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-scripts\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.877607 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-config-data\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.879072 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.879479 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"912732dc-e2fb-4acd-a463-69eb77ff5a6d","Type":"ContainerDied","Data":"571d10d4b293f47f87ffa463ecdd968925cfa0b67adbaa971cebf2efb8a123c6"} Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.879534 4852 scope.go:117] "RemoveContainer" containerID="d08fa1671f2f5c0b2e0350aa3c1dbfe61eda3a4a85e3d08b710f591f995a9445" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.880336 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.880370 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.880405 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.899949 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmh26\" (UniqueName: \"kubernetes.io/projected/f59c1e79-8117-43c8-bd38-0f1f144271a5-kube-api-access-mmh26\") pod \"nova-cell0-conductor-db-sync-zb7h6\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.909993 4852 scope.go:117] "RemoveContainer" containerID="6f69fc189608d721cb666cde8c3dfd0f6d22aad68377f4e17538400dbb12fb37" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.928100 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.944567 4852 scope.go:117] "RemoveContainer" containerID="2beaba334d4f987b6c6a3c8a9e0b734c50bf158c6c75d055a81a1f1dc2cde901" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.945075 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.958581 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.962485 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.964539 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.965076 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.965723 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.972670 4852 scope.go:117] "RemoveContainer" containerID="ac482ac18dd26eb9e6288d4b8befae5905404c133965037a1d5127ec8ac07a2a" Dec 10 12:14:47 crc kubenswrapper[4852]: I1210 12:14:47.991001 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.079550 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-config-data\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.080151 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-run-httpd\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.080188 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.080227 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-log-httpd\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.080272 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzm77\" (UniqueName: \"kubernetes.io/projected/8c46da6f-a477-4e9e-99b0-c6911589d538-kube-api-access-pzm77\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.080371 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-scripts\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.080400 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.182470 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-config-data\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc 
kubenswrapper[4852]: I1210 12:14:48.182739 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-run-httpd\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.182776 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.182815 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-log-httpd\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.182840 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzm77\" (UniqueName: \"kubernetes.io/projected/8c46da6f-a477-4e9e-99b0-c6911589d538-kube-api-access-pzm77\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.182923 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-scripts\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.182947 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.183846 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-run-httpd\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.183856 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-log-httpd\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.190541 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-config-data\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.192680 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-scripts\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.197510 4852 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="912732dc-e2fb-4acd-a463-69eb77ff5a6d" path="/var/lib/kubelet/pods/912732dc-e2fb-4acd-a463-69eb77ff5a6d/volumes" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.206046 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.206960 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzm77\" (UniqueName: \"kubernetes.io/projected/8c46da6f-a477-4e9e-99b0-c6911589d538-kube-api-access-pzm77\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.210784 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.288886 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.476336 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.476401 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.497913 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7h6"] Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.528540 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.542725 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.832850 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:14:48 crc kubenswrapper[4852]: W1210 12:14:48.835019 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c46da6f_a477_4e9e_99b0_c6911589d538.slice/crio-a396f8b57694420c91aec2a8473467c22cd35f099080b0e5507550b4e15d53a5 WatchSource:0}: Error finding container a396f8b57694420c91aec2a8473467c22cd35f099080b0e5507550b4e15d53a5: Status 404 returned error can't find the container with id a396f8b57694420c91aec2a8473467c22cd35f099080b0e5507550b4e15d53a5 Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.890658 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerStarted","Data":"a396f8b57694420c91aec2a8473467c22cd35f099080b0e5507550b4e15d53a5"} Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.893819 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" 
event={"ID":"f59c1e79-8117-43c8-bd38-0f1f144271a5","Type":"ContainerStarted","Data":"a3ac7e769d3de62c661e65ec7b628ef75bdd3c1d9cac6f27c65c71487a1f99d0"} Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.897240 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:48 crc kubenswrapper[4852]: I1210 12:14:48.897268 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:49 crc kubenswrapper[4852]: I1210 12:14:49.913861 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerStarted","Data":"b8a7581a17970c2f6e522fdba50e5b73f77d9bb7ce15c738e0e03c9b4a806715"} Dec 10 12:14:49 crc kubenswrapper[4852]: I1210 12:14:49.913905 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 10 12:14:49 crc kubenswrapper[4852]: I1210 12:14:49.915998 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 10 12:14:50 crc kubenswrapper[4852]: I1210 12:14:50.298589 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 10 12:14:50 crc kubenswrapper[4852]: I1210 12:14:50.339122 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 10 12:14:50 crc kubenswrapper[4852]: I1210 12:14:50.921935 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 10 12:14:50 crc kubenswrapper[4852]: I1210 12:14:50.923365 4852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 10 12:14:51 crc kubenswrapper[4852]: I1210 12:14:51.271149 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:51 crc kubenswrapper[4852]: I1210 12:14:51.272085 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 10 12:14:51 crc kubenswrapper[4852]: I1210 12:14:51.935577 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerStarted","Data":"56491f2b1a48f2615c49d9453b549cc73bbc034128686e46e7baa1087db0f7dd"} Dec 10 12:14:53 crc kubenswrapper[4852]: I1210 12:14:53.868437 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 10 12:14:53 crc kubenswrapper[4852]: I1210 12:14:53.958801 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerStarted","Data":"dca04ce64993fba2a20f84f740c64b809e1b5349d37dc97410669c30456cb7d4"} Dec 10 12:14:54 crc kubenswrapper[4852]: I1210 12:14:54.640989 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.146205 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4"] Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.148071 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.150320 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.150319 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.161739 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4"] Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.256120 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtmr\" (UniqueName: \"kubernetes.io/projected/f07657eb-924e-43f3-b088-c851f8c62424-kube-api-access-8xtmr\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.256686 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f07657eb-924e-43f3-b088-c851f8c62424-config-volume\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.256789 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f07657eb-924e-43f3-b088-c851f8c62424-secret-volume\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.358658 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtmr\" (UniqueName: \"kubernetes.io/projected/f07657eb-924e-43f3-b088-c851f8c62424-kube-api-access-8xtmr\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.358756 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f07657eb-924e-43f3-b088-c851f8c62424-config-volume\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.358800 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f07657eb-924e-43f3-b088-c851f8c62424-secret-volume\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.359914 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f07657eb-924e-43f3-b088-c851f8c62424-config-volume\") pod 
\"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.374285 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f07657eb-924e-43f3-b088-c851f8c62424-secret-volume\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.374653 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtmr\" (UniqueName: \"kubernetes.io/projected/f07657eb-924e-43f3-b088-c851f8c62424-kube-api-access-8xtmr\") pod \"collect-profiles-29422815-pvrg4\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:00 crc kubenswrapper[4852]: I1210 12:15:00.480412 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:02 crc kubenswrapper[4852]: I1210 12:15:02.737870 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4"] Dec 10 12:15:02 crc kubenswrapper[4852]: W1210 12:15:02.756035 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07657eb_924e_43f3_b088_c851f8c62424.slice/crio-ad5242c681f28d1fa49254d33a2b9247c8b42e16c8b54e351ccb4a09fe122796 WatchSource:0}: Error finding container ad5242c681f28d1fa49254d33a2b9247c8b42e16c8b54e351ccb4a09fe122796: Status 404 returned error can't find the container with id ad5242c681f28d1fa49254d33a2b9247c8b42e16c8b54e351ccb4a09fe122796 Dec 10 12:15:03 crc kubenswrapper[4852]: I1210 12:15:03.164689 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" event={"ID":"f59c1e79-8117-43c8-bd38-0f1f144271a5","Type":"ContainerStarted","Data":"e7f127cfc23767857e99bf4d219bdc65b7ff545da4bc7a33f028fff5ba930b28"} Dec 10 12:15:03 crc kubenswrapper[4852]: I1210 12:15:03.166150 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" event={"ID":"f07657eb-924e-43f3-b088-c851f8c62424","Type":"ContainerStarted","Data":"ad5242c681f28d1fa49254d33a2b9247c8b42e16c8b54e351ccb4a09fe122796"} Dec 10 12:15:03 crc kubenswrapper[4852]: I1210 12:15:03.169749 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerStarted","Data":"fd22a16de6630b6271a2a55e315660a554c51fd64b9a509bccb0ee8b6b098acc"} Dec 10 12:15:03 crc kubenswrapper[4852]: I1210 12:15:03.187660 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" podStartSLOduration=2.639506094 podStartE2EDuration="16.1876382s" podCreationTimestamp="2025-12-10 12:14:47 +0000 UTC" firstStartedPulling="2025-12-10 12:14:48.504186415 +0000 UTC m=+1374.589711639" lastFinishedPulling="2025-12-10 12:15:02.052318521 +0000 UTC m=+1388.137843745" observedRunningTime="2025-12-10 12:15:03.183483436 +0000 UTC m=+1389.269008670" watchObservedRunningTime="2025-12-10 12:15:03.1876382 +0000 UTC m=+1389.273163444" Dec 10 12:15:04 
crc kubenswrapper[4852]: I1210 12:15:04.181934 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" event={"ID":"f07657eb-924e-43f3-b088-c851f8c62424","Type":"ContainerStarted","Data":"5cd5645fea72c98047c9c3dcfa0b983b1a975a5e273e345550e7cc606fec5f2e"} Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.203927 4852 generic.go:334] "Generic (PLEG): container finished" podID="f07657eb-924e-43f3-b088-c851f8c62424" containerID="5cd5645fea72c98047c9c3dcfa0b983b1a975a5e273e345550e7cc606fec5f2e" exitCode=0 Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.204058 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" event={"ID":"f07657eb-924e-43f3-b088-c851f8c62424","Type":"ContainerDied","Data":"5cd5645fea72c98047c9c3dcfa0b983b1a975a5e273e345550e7cc606fec5f2e"} Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.204449 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-central-agent" containerID="cri-o://b8a7581a17970c2f6e522fdba50e5b73f77d9bb7ce15c738e0e03c9b4a806715" gracePeriod=30 Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.204475 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="sg-core" containerID="cri-o://dca04ce64993fba2a20f84f740c64b809e1b5349d37dc97410669c30456cb7d4" gracePeriod=30 Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.204515 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.204530 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-notification-agent" containerID="cri-o://56491f2b1a48f2615c49d9453b549cc73bbc034128686e46e7baa1087db0f7dd" gracePeriod=30 Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.204594 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="proxy-httpd" containerID="cri-o://fd22a16de6630b6271a2a55e315660a554c51fd64b9a509bccb0ee8b6b098acc" gracePeriod=30 Dec 10 12:15:06 crc kubenswrapper[4852]: I1210 12:15:06.257323 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.463554947 podStartE2EDuration="19.257297111s" podCreationTimestamp="2025-12-10 12:14:47 +0000 UTC" firstStartedPulling="2025-12-10 12:14:48.841855262 +0000 UTC m=+1374.927380486" lastFinishedPulling="2025-12-10 12:15:02.635597426 +0000 UTC m=+1388.721122650" observedRunningTime="2025-12-10 12:15:06.246009199 +0000 UTC m=+1392.331534423" watchObservedRunningTime="2025-12-10 12:15:06.257297111 +0000 UTC m=+1392.342822355" Dec 10 12:15:07 crc kubenswrapper[4852]: I1210 12:15:07.216411 4852 generic.go:334] "Generic (PLEG): container finished" podID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerID="fd22a16de6630b6271a2a55e315660a554c51fd64b9a509bccb0ee8b6b098acc" exitCode=0 Dec 10 12:15:07 crc kubenswrapper[4852]: I1210 12:15:07.216448 4852 generic.go:334] "Generic (PLEG): container finished" podID="8c46da6f-a477-4e9e-99b0-c6911589d538" 
containerID="dca04ce64993fba2a20f84f740c64b809e1b5349d37dc97410669c30456cb7d4" exitCode=2 Dec 10 12:15:07 crc kubenswrapper[4852]: I1210 12:15:07.216482 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerDied","Data":"fd22a16de6630b6271a2a55e315660a554c51fd64b9a509bccb0ee8b6b098acc"} Dec 10 12:15:07 crc kubenswrapper[4852]: I1210 12:15:07.216528 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerDied","Data":"dca04ce64993fba2a20f84f740c64b809e1b5349d37dc97410669c30456cb7d4"} Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.411167 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.528883 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f07657eb-924e-43f3-b088-c851f8c62424-secret-volume\") pod \"f07657eb-924e-43f3-b088-c851f8c62424\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.529140 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f07657eb-924e-43f3-b088-c851f8c62424-config-volume\") pod \"f07657eb-924e-43f3-b088-c851f8c62424\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.529225 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xtmr\" (UniqueName: \"kubernetes.io/projected/f07657eb-924e-43f3-b088-c851f8c62424-kube-api-access-8xtmr\") pod \"f07657eb-924e-43f3-b088-c851f8c62424\" (UID: \"f07657eb-924e-43f3-b088-c851f8c62424\") " Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.529552 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07657eb-924e-43f3-b088-c851f8c62424-config-volume" (OuterVolumeSpecName: "config-volume") pod "f07657eb-924e-43f3-b088-c851f8c62424" (UID: "f07657eb-924e-43f3-b088-c851f8c62424"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.529726 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f07657eb-924e-43f3-b088-c851f8c62424-config-volume\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.535082 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07657eb-924e-43f3-b088-c851f8c62424-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f07657eb-924e-43f3-b088-c851f8c62424" (UID: "f07657eb-924e-43f3-b088-c851f8c62424"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.535306 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07657eb-924e-43f3-b088-c851f8c62424-kube-api-access-8xtmr" (OuterVolumeSpecName: "kube-api-access-8xtmr") pod "f07657eb-924e-43f3-b088-c851f8c62424" (UID: "f07657eb-924e-43f3-b088-c851f8c62424"). InnerVolumeSpecName "kube-api-access-8xtmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.631371 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xtmr\" (UniqueName: \"kubernetes.io/projected/f07657eb-924e-43f3-b088-c851f8c62424-kube-api-access-8xtmr\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:08 crc kubenswrapper[4852]: I1210 12:15:08.631744 4852 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f07657eb-924e-43f3-b088-c851f8c62424-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:09 crc kubenswrapper[4852]: I1210 12:15:09.239557 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" event={"ID":"f07657eb-924e-43f3-b088-c851f8c62424","Type":"ContainerDied","Data":"ad5242c681f28d1fa49254d33a2b9247c8b42e16c8b54e351ccb4a09fe122796"} Dec 10 12:15:09 crc kubenswrapper[4852]: I1210 12:15:09.239607 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad5242c681f28d1fa49254d33a2b9247c8b42e16c8b54e351ccb4a09fe122796" Dec 10 12:15:09 crc kubenswrapper[4852]: I1210 12:15:09.240075 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4" Dec 10 12:15:11 crc kubenswrapper[4852]: I1210 12:15:11.264210 4852 generic.go:334] "Generic (PLEG): container finished" podID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerID="56491f2b1a48f2615c49d9453b549cc73bbc034128686e46e7baa1087db0f7dd" exitCode=0 Dec 10 12:15:11 crc kubenswrapper[4852]: I1210 12:15:11.264585 4852 generic.go:334] "Generic (PLEG): container finished" podID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerID="b8a7581a17970c2f6e522fdba50e5b73f77d9bb7ce15c738e0e03c9b4a806715" exitCode=0 Dec 10 12:15:11 crc kubenswrapper[4852]: I1210 12:15:11.264458 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerDied","Data":"56491f2b1a48f2615c49d9453b549cc73bbc034128686e46e7baa1087db0f7dd"} Dec 10 12:15:11 crc kubenswrapper[4852]: I1210 12:15:11.264635 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerDied","Data":"b8a7581a17970c2f6e522fdba50e5b73f77d9bb7ce15c738e0e03c9b4a806715"} Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.141644 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.283466 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c46da6f-a477-4e9e-99b0-c6911589d538","Type":"ContainerDied","Data":"a396f8b57694420c91aec2a8473467c22cd35f099080b0e5507550b4e15d53a5"} Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.283523 4852 scope.go:117] "RemoveContainer" containerID="fd22a16de6630b6271a2a55e315660a554c51fd64b9a509bccb0ee8b6b098acc" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.283564 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.293977 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-combined-ca-bundle\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.294562 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzm77\" (UniqueName: \"kubernetes.io/projected/8c46da6f-a477-4e9e-99b0-c6911589d538-kube-api-access-pzm77\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.294624 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-scripts\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.294722 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-log-httpd\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.294772 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-sg-core-conf-yaml\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.294815 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-config-data\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.294898 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-run-httpd\") pod \"8c46da6f-a477-4e9e-99b0-c6911589d538\" (UID: \"8c46da6f-a477-4e9e-99b0-c6911589d538\") " Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.295570 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.296068 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.302962 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c46da6f-a477-4e9e-99b0-c6911589d538-kube-api-access-pzm77" (OuterVolumeSpecName: "kube-api-access-pzm77") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "kube-api-access-pzm77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.303401 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-scripts" (OuterVolumeSpecName: "scripts") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.307355 4852 scope.go:117] "RemoveContainer" containerID="dca04ce64993fba2a20f84f740c64b809e1b5349d37dc97410669c30456cb7d4" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.329968 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.374693 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.397375 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.397429 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.397468 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c46da6f-a477-4e9e-99b0-c6911589d538-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.397477 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.397488 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzm77\" (UniqueName: \"kubernetes.io/projected/8c46da6f-a477-4e9e-99b0-c6911589d538-kube-api-access-pzm77\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.397496 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.408287 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-config-data" (OuterVolumeSpecName: "config-data") pod "8c46da6f-a477-4e9e-99b0-c6911589d538" (UID: "8c46da6f-a477-4e9e-99b0-c6911589d538"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.464413 4852 scope.go:117] "RemoveContainer" containerID="56491f2b1a48f2615c49d9453b549cc73bbc034128686e46e7baa1087db0f7dd" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.489907 4852 scope.go:117] "RemoveContainer" containerID="b8a7581a17970c2f6e522fdba50e5b73f77d9bb7ce15c738e0e03c9b4a806715" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.499569 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c46da6f-a477-4e9e-99b0-c6911589d538-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.624264 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.632945 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652081 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:12 crc kubenswrapper[4852]: E1210 12:15:12.652444 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="sg-core" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652460 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="sg-core" Dec 10 12:15:12 crc kubenswrapper[4852]: E1210 12:15:12.652474 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-central-agent" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652482 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-central-agent" Dec 10 12:15:12 crc kubenswrapper[4852]: E1210 12:15:12.652510 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-notification-agent" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652516 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-notification-agent" Dec 10 12:15:12 crc kubenswrapper[4852]: E1210 12:15:12.652526 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="proxy-httpd" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652532 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="proxy-httpd" Dec 10 12:15:12 crc kubenswrapper[4852]: E1210 12:15:12.652552 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07657eb-924e-43f3-b088-c851f8c62424" containerName="collect-profiles" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652559 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07657eb-924e-43f3-b088-c851f8c62424" containerName="collect-profiles" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652746 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-notification-agent" Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652760 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="proxy-httpd" Dec 10 12:15:12 
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652796 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="ceilometer-central-agent"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.652809 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" containerName="sg-core"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.654330 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.657409 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.657170 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.673669 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805608 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-log-httpd\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805658 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-config-data\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805680 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805787 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-run-httpd\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805808 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-scripts\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805825 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvz8f\" (UniqueName: \"kubernetes.io/projected/fdad71ba-2d9b-476b-bda6-9d09183ec007-kube-api-access-jvz8f\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.805873 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.907526 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-config-data\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.907569 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.907685 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-run-httpd\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.908158 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-run-httpd\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.908202 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-scripts\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.908680 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvz8f\" (UniqueName: \"kubernetes.io/projected/fdad71ba-2d9b-476b-bda6-9d09183ec007-kube-api-access-jvz8f\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.909126 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.909308 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-log-httpd\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.909585 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-log-httpd\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.912308 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.912398 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-scripts\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.912721 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.913639 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-config-data\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.929323 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvz8f\" (UniqueName: \"kubernetes.io/projected/fdad71ba-2d9b-476b-bda6-9d09183ec007-kube-api-access-jvz8f\") pod \"ceilometer-0\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " pod="openstack/ceilometer-0"
Dec 10 12:15:12 crc kubenswrapper[4852]: I1210 12:15:12.974136 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:13 crc kubenswrapper[4852]: I1210 12:15:13.422534 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:13 crc kubenswrapper[4852]: W1210 12:15:13.425600 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdad71ba_2d9b_476b_bda6_9d09183ec007.slice/crio-40806783c078a9abf6d546dafa11363b0402d1ce8ba4bf71668c1bffe2ed8fb5 WatchSource:0}: Error finding container 40806783c078a9abf6d546dafa11363b0402d1ce8ba4bf71668c1bffe2ed8fb5: Status 404 returned error can't find the container with id 40806783c078a9abf6d546dafa11363b0402d1ce8ba4bf71668c1bffe2ed8fb5 Dec 10 12:15:14 crc kubenswrapper[4852]: I1210 12:15:14.185235 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c46da6f-a477-4e9e-99b0-c6911589d538" path="/var/lib/kubelet/pods/8c46da6f-a477-4e9e-99b0-c6911589d538/volumes" Dec 10 12:15:14 crc kubenswrapper[4852]: I1210 12:15:14.345423 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerStarted","Data":"40806783c078a9abf6d546dafa11363b0402d1ce8ba4bf71668c1bffe2ed8fb5"} Dec 10 12:15:14 crc kubenswrapper[4852]: I1210 12:15:14.925466 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:15 crc kubenswrapper[4852]: I1210 12:15:15.356997 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerStarted","Data":"fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da"} Dec 10 12:15:15 crc kubenswrapper[4852]: I1210 12:15:15.357335 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerStarted","Data":"22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1"} Dec 10 12:15:16 crc kubenswrapper[4852]: I1210 12:15:16.370199 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerStarted","Data":"a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac"} Dec 10 12:15:18 crc kubenswrapper[4852]: I1210 12:15:18.389027 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerStarted","Data":"20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4"} Dec 10 12:15:19 crc kubenswrapper[4852]: I1210 12:15:19.398600 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-central-agent" containerID="cri-o://22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1" gracePeriod=30 Dec 10 12:15:19 crc kubenswrapper[4852]: I1210 12:15:19.398679 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="sg-core" containerID="cri-o://a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac" gracePeriod=30 Dec 10 12:15:19 crc kubenswrapper[4852]: I1210 12:15:19.398786 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" 
containerName="ceilometer-notification-agent" containerID="cri-o://fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da" gracePeriod=30 Dec 10 12:15:19 crc kubenswrapper[4852]: I1210 12:15:19.398892 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="proxy-httpd" containerID="cri-o://20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4" gracePeriod=30 Dec 10 12:15:19 crc kubenswrapper[4852]: I1210 12:15:19.399032 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:15:19 crc kubenswrapper[4852]: I1210 12:15:19.428807 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.223351456 podStartE2EDuration="7.428779267s" podCreationTimestamp="2025-12-10 12:15:12 +0000 UTC" firstStartedPulling="2025-12-10 12:15:13.427721039 +0000 UTC m=+1399.513246263" lastFinishedPulling="2025-12-10 12:15:17.63314885 +0000 UTC m=+1403.718674074" observedRunningTime="2025-12-10 12:15:19.419140726 +0000 UTC m=+1405.504665950" watchObservedRunningTime="2025-12-10 12:15:19.428779267 +0000 UTC m=+1405.514304491" Dec 10 12:15:20 crc kubenswrapper[4852]: I1210 12:15:20.409932 4852 generic.go:334] "Generic (PLEG): container finished" podID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerID="20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4" exitCode=0 Dec 10 12:15:20 crc kubenswrapper[4852]: I1210 12:15:20.409968 4852 generic.go:334] "Generic (PLEG): container finished" podID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerID="a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac" exitCode=2 Dec 10 12:15:20 crc kubenswrapper[4852]: I1210 12:15:20.409977 4852 generic.go:334] "Generic (PLEG): container finished" podID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerID="fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da" exitCode=0 Dec 10 12:15:20 crc kubenswrapper[4852]: I1210 12:15:20.409998 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerDied","Data":"20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4"} Dec 10 12:15:20 crc kubenswrapper[4852]: I1210 12:15:20.410023 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerDied","Data":"a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac"} Dec 10 12:15:20 crc kubenswrapper[4852]: I1210 12:15:20.410033 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerDied","Data":"fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da"} Dec 10 12:15:24 crc kubenswrapper[4852]: I1210 12:15:24.980105 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.141181 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-combined-ca-bundle\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.141291 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-run-httpd\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.141323 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-log-httpd\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.141458 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-scripts\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.141546 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvz8f\" (UniqueName: \"kubernetes.io/projected/fdad71ba-2d9b-476b-bda6-9d09183ec007-kube-api-access-jvz8f\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.142201 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-config-data\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.141917 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.142252 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-sg-core-conf-yaml\") pod \"fdad71ba-2d9b-476b-bda6-9d09183ec007\" (UID: \"fdad71ba-2d9b-476b-bda6-9d09183ec007\") " Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.142019 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.142537 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.142557 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fdad71ba-2d9b-476b-bda6-9d09183ec007-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.147854 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-scripts" (OuterVolumeSpecName: "scripts") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.148096 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdad71ba-2d9b-476b-bda6-9d09183ec007-kube-api-access-jvz8f" (OuterVolumeSpecName: "kube-api-access-jvz8f") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "kube-api-access-jvz8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.176431 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.221863 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.244737 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvz8f\" (UniqueName: \"kubernetes.io/projected/fdad71ba-2d9b-476b-bda6-9d09183ec007-kube-api-access-jvz8f\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.244779 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.244790 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.244799 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.252921 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-config-data" (OuterVolumeSpecName: "config-data") pod "fdad71ba-2d9b-476b-bda6-9d09183ec007" (UID: "fdad71ba-2d9b-476b-bda6-9d09183ec007"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.346415 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdad71ba-2d9b-476b-bda6-9d09183ec007-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.458095 4852 generic.go:334] "Generic (PLEG): container finished" podID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerID="22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1" exitCode=0 Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.458137 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerDied","Data":"22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1"} Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.458169 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fdad71ba-2d9b-476b-bda6-9d09183ec007","Type":"ContainerDied","Data":"40806783c078a9abf6d546dafa11363b0402d1ce8ba4bf71668c1bffe2ed8fb5"} Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.458177 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.458185 4852 scope.go:117] "RemoveContainer" containerID="20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.482573 4852 scope.go:117] "RemoveContainer" containerID="a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.502769 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.523163 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.524005 4852 scope.go:117] "RemoveContainer" containerID="fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.539972 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.540646 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="proxy-httpd" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.540751 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="proxy-httpd" Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.540825 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-central-agent" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.540881 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-central-agent" Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.540958 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-notification-agent" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.541018 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-notification-agent" Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.541078 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="sg-core" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.541132 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="sg-core" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.541376 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="sg-core" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.541452 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-central-agent" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.541514 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="proxy-httpd" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.541572 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" containerName="ceilometer-notification-agent" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.543317 4852 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.550077 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.551411 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.564087 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.564707 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-run-httpd\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.564756 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.564822 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.564861 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-scripts\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.565020 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-config-data\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.565095 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh96h\" (UniqueName: \"kubernetes.io/projected/22d60e4a-73d6-4a31-a915-b336ce32d34f-kube-api-access-kh96h\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.565173 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-log-httpd\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.571816 4852 scope.go:117] "RemoveContainer" containerID="22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.613100 4852 scope.go:117] "RemoveContainer" containerID="20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4" Dec 10 12:15:25 crc 
kubenswrapper[4852]: E1210 12:15:25.613660 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4\": container with ID starting with 20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4 not found: ID does not exist" containerID="20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.613691 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4"} err="failed to get container status \"20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4\": rpc error: code = NotFound desc = could not find container \"20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4\": container with ID starting with 20335b37b9ed6d7d859a2a3654011ee5a335dfbf6a81b352b64aec0b3b1803c4 not found: ID does not exist" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.613717 4852 scope.go:117] "RemoveContainer" containerID="a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac" Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.614000 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac\": container with ID starting with a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac not found: ID does not exist" containerID="a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.614023 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac"} err="failed to get container status \"a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac\": rpc error: code = NotFound desc = could not find container \"a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac\": container with ID starting with a70c0782f7c9c1b5f28cb65f74373beab679934d45cbc82af013ea482fbb6bac not found: ID does not exist" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.614042 4852 scope.go:117] "RemoveContainer" containerID="fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da" Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.614396 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da\": container with ID starting with fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da not found: ID does not exist" containerID="fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da" Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.614446 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da"} err="failed to get container status \"fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da\": rpc error: code = NotFound desc = could not find container \"fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da\": container with ID starting with fc6d46a12aabb5152ac3de0892e675e4df877fbcce427ea1c184d107414bc9da not found: ID does not exist" Dec 10 12:15:25 crc kubenswrapper[4852]: 
Dec 10 12:15:25 crc kubenswrapper[4852]: E1210 12:15:25.614837 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1\": container with ID starting with 22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1 not found: ID does not exist" containerID="22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.614871 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1"} err="failed to get container status \"22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1\": rpc error: code = NotFound desc = could not find container \"22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1\": container with ID starting with 22566c014313fde3b83f750a9fd38ab4c94b43f3a458867e4994844bfa32ebc1 not found: ID does not exist"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.666586 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-run-httpd\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.666885 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.667059 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.667170 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-scripts\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.667733 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-config-data\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.667917 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh96h\" (UniqueName: \"kubernetes.io/projected/22d60e4a-73d6-4a31-a915-b336ce32d34f-kube-api-access-kh96h\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.668008 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-log-httpd\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.668395 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-log-httpd\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.667087 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-run-httpd\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.671315 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-scripts\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.671889 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.672427 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.673905 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-config-data\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.686605 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh96h\" (UniqueName: \"kubernetes.io/projected/22d60e4a-73d6-4a31-a915-b336ce32d34f-kube-api-access-kh96h\") pod \"ceilometer-0\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") " pod="openstack/ceilometer-0"
Dec 10 12:15:25 crc kubenswrapper[4852]: I1210 12:15:25.868961 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:15:26 crc kubenswrapper[4852]: I1210 12:15:26.129019 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:15:26 crc kubenswrapper[4852]: I1210 12:15:26.186602 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdad71ba-2d9b-476b-bda6-9d09183ec007" path="/var/lib/kubelet/pods/fdad71ba-2d9b-476b-bda6-9d09183ec007/volumes" Dec 10 12:15:26 crc kubenswrapper[4852]: I1210 12:15:26.469718 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerStarted","Data":"4ae547d73f11ec0cec4f940a1651be7a5090e494905d1a44df8b2e59a82bf293"} Dec 10 12:15:27 crc kubenswrapper[4852]: I1210 12:15:27.480279 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerStarted","Data":"d40dd3fc0e1816f8acd2a31c4fee5f68835ed642fcb98f672584276c1e167de2"} Dec 10 12:15:29 crc kubenswrapper[4852]: I1210 12:15:29.499578 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerStarted","Data":"f420eacbf4fc57ab16572d037b25d04c4bad0375c0a7c419cc1e238afe9cd3b6"} Dec 10 12:15:32 crc kubenswrapper[4852]: I1210 12:15:32.530906 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerStarted","Data":"7a871a26d3ae77a77a248583bbbe2d55f912bf2d0b45a0a2d07190bcc3fbab1b"} Dec 10 12:15:33 crc kubenswrapper[4852]: I1210 12:15:33.542585 4852 generic.go:334] "Generic (PLEG): container finished" podID="f59c1e79-8117-43c8-bd38-0f1f144271a5" containerID="e7f127cfc23767857e99bf4d219bdc65b7ff545da4bc7a33f028fff5ba930b28" exitCode=0 Dec 10 12:15:33 crc kubenswrapper[4852]: I1210 12:15:33.542682 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" event={"ID":"f59c1e79-8117-43c8-bd38-0f1f144271a5","Type":"ContainerDied","Data":"e7f127cfc23767857e99bf4d219bdc65b7ff545da4bc7a33f028fff5ba930b28"} Dec 10 12:15:34 crc kubenswrapper[4852]: I1210 12:15:34.910761 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.068796 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-config-data\") pod \"f59c1e79-8117-43c8-bd38-0f1f144271a5\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.068876 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-combined-ca-bundle\") pod \"f59c1e79-8117-43c8-bd38-0f1f144271a5\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.068983 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmh26\" (UniqueName: \"kubernetes.io/projected/f59c1e79-8117-43c8-bd38-0f1f144271a5-kube-api-access-mmh26\") pod \"f59c1e79-8117-43c8-bd38-0f1f144271a5\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.069072 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-scripts\") pod \"f59c1e79-8117-43c8-bd38-0f1f144271a5\" (UID: \"f59c1e79-8117-43c8-bd38-0f1f144271a5\") " Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.074124 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-scripts" (OuterVolumeSpecName: "scripts") pod "f59c1e79-8117-43c8-bd38-0f1f144271a5" (UID: "f59c1e79-8117-43c8-bd38-0f1f144271a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.087735 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59c1e79-8117-43c8-bd38-0f1f144271a5-kube-api-access-mmh26" (OuterVolumeSpecName: "kube-api-access-mmh26") pod "f59c1e79-8117-43c8-bd38-0f1f144271a5" (UID: "f59c1e79-8117-43c8-bd38-0f1f144271a5"). InnerVolumeSpecName "kube-api-access-mmh26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.112359 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-config-data" (OuterVolumeSpecName: "config-data") pod "f59c1e79-8117-43c8-bd38-0f1f144271a5" (UID: "f59c1e79-8117-43c8-bd38-0f1f144271a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.113978 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f59c1e79-8117-43c8-bd38-0f1f144271a5" (UID: "f59c1e79-8117-43c8-bd38-0f1f144271a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.170760 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmh26\" (UniqueName: \"kubernetes.io/projected/f59c1e79-8117-43c8-bd38-0f1f144271a5-kube-api-access-mmh26\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.170795 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.170808 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.170820 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c1e79-8117-43c8-bd38-0f1f144271a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.569444 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" event={"ID":"f59c1e79-8117-43c8-bd38-0f1f144271a5","Type":"ContainerDied","Data":"a3ac7e769d3de62c661e65ec7b628ef75bdd3c1d9cac6f27c65c71487a1f99d0"} Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.569739 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ac7e769d3de62c661e65ec7b628ef75bdd3c1d9cac6f27c65c71487a1f99d0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.569518 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7h6" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.679039 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 10 12:15:35 crc kubenswrapper[4852]: E1210 12:15:35.679562 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59c1e79-8117-43c8-bd38-0f1f144271a5" containerName="nova-cell0-conductor-db-sync" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.679586 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59c1e79-8117-43c8-bd38-0f1f144271a5" containerName="nova-cell0-conductor-db-sync" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.679816 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59c1e79-8117-43c8-bd38-0f1f144271a5" containerName="nova-cell0-conductor-db-sync" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.680592 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.684902 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-45f4l" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.685766 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.694790 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.781343 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.781451 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clqcb\" (UniqueName: \"kubernetes.io/projected/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-kube-api-access-clqcb\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.781565 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.883510 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.883604 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clqcb\" (UniqueName: \"kubernetes.io/projected/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-kube-api-access-clqcb\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.883687 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.888139 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.891851 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.905856 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clqcb\" (UniqueName: \"kubernetes.io/projected/70c6e25c-cf76-4ec0-9981-5a8dbc98d07e-kube-api-access-clqcb\") pod \"nova-cell0-conductor-0\" (UID: \"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e\") " pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:35 crc kubenswrapper[4852]: I1210 12:15:35.995798 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 10 12:15:36 crc kubenswrapper[4852]: W1210 12:15:36.443419 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c6e25c_cf76_4ec0_9981_5a8dbc98d07e.slice/crio-88e53fef5e3697f6b1348312c2d3ae5341c195dbd462e4d2853d0797c69d2b4d WatchSource:0}: Error finding container 88e53fef5e3697f6b1348312c2d3ae5341c195dbd462e4d2853d0797c69d2b4d: Status 404 returned error can't find the container with id 88e53fef5e3697f6b1348312c2d3ae5341c195dbd462e4d2853d0797c69d2b4d Dec 10 12:15:36 crc kubenswrapper[4852]: I1210 12:15:36.444376 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 10 12:15:36 crc kubenswrapper[4852]: I1210 12:15:36.579865 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e","Type":"ContainerStarted","Data":"88e53fef5e3697f6b1348312c2d3ae5341c195dbd462e4d2853d0797c69d2b4d"} Dec 10 12:15:36 crc kubenswrapper[4852]: I1210 12:15:36.582039 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerStarted","Data":"923dcee73f8bea88d93ebec76d72a03a548982e682d04deb32a2b83bc3729039"} Dec 10 12:15:36 crc kubenswrapper[4852]: I1210 12:15:36.583287 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:15:36 crc kubenswrapper[4852]: I1210 12:15:36.607521 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.286173649 podStartE2EDuration="11.60750222s" podCreationTimestamp="2025-12-10 12:15:25 +0000 UTC" firstStartedPulling="2025-12-10 12:15:26.138283357 +0000 UTC m=+1412.223808581" lastFinishedPulling="2025-12-10 12:15:35.459611928 +0000 UTC m=+1421.545137152" observedRunningTime="2025-12-10 12:15:36.606864344 +0000 UTC m=+1422.692389568" watchObservedRunningTime="2025-12-10 12:15:36.60750222 +0000 UTC m=+1422.693027444" Dec 10 12:15:37 crc kubenswrapper[4852]: I1210 12:15:37.595078 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"70c6e25c-cf76-4ec0-9981-5a8dbc98d07e","Type":"ContainerStarted","Data":"7041de462f35e5e39d9205115790a9c8a9d8535c63c0909fa64846e63def9eee"} Dec 10 12:15:37 crc kubenswrapper[4852]: I1210 12:15:37.614643 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.614625855 podStartE2EDuration="2.614625855s" podCreationTimestamp="2025-12-10 12:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:15:37.612141413 +0000 UTC m=+1423.697666657" watchObservedRunningTime="2025-12-10 
12:15:37.614625855 +0000 UTC m=+1423.700151079"
Dec 10 12:15:38 crc kubenswrapper[4852]: I1210 12:15:38.604192 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 10 12:15:45 crc kubenswrapper[4852]: I1210 12:15:45.790169 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:15:45 crc kubenswrapper[4852]: I1210 12:15:45.790554 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.029805 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.504994 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-svlcr"]
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.509458 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-svlcr"
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.513428 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.514101 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.516602 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-svlcr"]
Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.683401 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
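The pod_startup_latency_tracker entries above report two figures per pod: podStartE2EDuration (pod creation to observed running) and podStartSLOduration, which discounts time spent pulling images. For ceilometer-0 the numbers are self-consistent: 11.6075 s end to end minus a 9.3213 s pull window (firstStartedPulling to lastFinishedPulling) leaves the reported 2.2862 s. A minimal Python sketch, not kubelet code, checking that arithmetic with the values copied from the ceilometer-0 entry:

```python
# Minimal sketch (not kubelet code): check that podStartSLOduration is
# podStartE2EDuration minus the image-pull window for ceilometer-0 above.
from datetime import datetime

def parse(ts: str) -> datetime:
    # "2025-12-10 12:15:26.138283357 +0000 UTC m=+1412.223808581":
    # keep date, wall clock (nanoseconds truncated to microseconds) and
    # offset; drop the trailing "UTC" and the monotonic "m=+..." suffix.
    date, clock, off = ts.split()[:3]
    return datetime.strptime(f"{date} {clock[:15]} {off}",
                             "%Y-%m-%d %H:%M:%S.%f %z")

first_pull = parse("2025-12-10 12:15:26.138283357 +0000 UTC m=+1412.223808581")
last_pull = parse("2025-12-10 12:15:35.459611928 +0000 UTC m=+1421.545137152")
e2e = 11.60750222   # podStartE2EDuration, seconds
slo = 2.286173649   # podStartSLOduration, seconds

pull_window = (last_pull - first_pull).total_seconds()   # ~9.3213 s
assert abs(e2e - pull_window - slo) < 1e-3   # SLO time = E2E minus pull time
```

The same identity holds for nova-scheduler-0, nova-api-0 and the other pulls recorded later in this excerpt; pods that pulled nothing report the zero time for both pull timestamps and identical SLO and E2E durations.

Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.688145 4852 util.go:30] "No sandbox for pod can be found.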
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.692328 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.701867 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-config-data\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.701959 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g694\" (UniqueName: \"kubernetes.io/projected/71c50ff1-4991-48d8-9b54-f49f711fffb6-kube-api-access-5g694\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.702009 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-scripts\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.702041 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.719972 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811088 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-config-data\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811163 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nnd\" (UniqueName: \"kubernetes.io/projected/d9ae3fdf-8e16-4763-8f50-389a52334458-kube-api-access-k4nnd\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811197 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-config-data\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811300 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " 
pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811378 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g694\" (UniqueName: \"kubernetes.io/projected/71c50ff1-4991-48d8-9b54-f49f711fffb6-kube-api-access-5g694\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811453 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-scripts\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.811502 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.822089 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-scripts\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.834432 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-config-data\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.835063 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.839889 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.841420 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.847431 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.875150 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g694\" (UniqueName: \"kubernetes.io/projected/71c50ff1-4991-48d8-9b54-f49f711fffb6-kube-api-access-5g694\") pod \"nova-cell0-cell-mapping-svlcr\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.917390 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919082 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f3043a-1163-4556-bcd1-9fa1087d61f1-logs\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919155 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4nnd\" (UniqueName: \"kubernetes.io/projected/d9ae3fdf-8e16-4763-8f50-389a52334458-kube-api-access-k4nnd\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919179 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-config-data\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919208 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919263 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919299 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wl8v\" (UniqueName: \"kubernetes.io/projected/f8f3043a-1163-4556-bcd1-9fa1087d61f1-kube-api-access-4wl8v\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.919317 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-config-data\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.939412 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-config-data\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.939485 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.941377 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.946014 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.955825 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.963746 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4nnd\" (UniqueName: \"kubernetes.io/projected/d9ae3fdf-8e16-4763-8f50-389a52334458-kube-api-access-k4nnd\") pod \"nova-scheduler-0\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") " pod="openstack/nova-scheduler-0" Dec 10 12:15:46 crc kubenswrapper[4852]: I1210 12:15:46.993736 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.006301 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.007972 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.012569 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.020996 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.021063 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wl8v\" (UniqueName: \"kubernetes.io/projected/f8f3043a-1163-4556-bcd1-9fa1087d61f1-kube-api-access-4wl8v\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.021092 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-config-data\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.021127 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f3043a-1163-4556-bcd1-9fa1087d61f1-logs\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.022199 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f3043a-1163-4556-bcd1-9fa1087d61f1-logs\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.030788 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.048627 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-config-data\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.049646 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.067359 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.069671 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wl8v\" (UniqueName: \"kubernetes.io/projected/f8f3043a-1163-4556-bcd1-9fa1087d61f1-kube-api-access-4wl8v\") pod \"nova-metadata-0\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.097016 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.099180 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-x7855"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.102444 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.107454 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-x7855"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.124150 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.124215 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-config-data\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.128363 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f56v\" (UniqueName: \"kubernetes.io/projected/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-kube-api-access-8f56v\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.128532 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfgv\" (UniqueName: \"kubernetes.io/projected/67a4146b-61bc-4522-8247-43bbfa3dfed7-kube-api-access-tsfgv\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.128704 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.128870 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a4146b-61bc-4522-8247-43bbfa3dfed7-logs\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.128945 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.146966 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2rnx\" (UniqueName: \"kubernetes.io/projected/e127297c-061e-457a-9c2a-5794a1f39a3a-kube-api-access-g2rnx\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231781 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231807 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-config-data\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231832 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f56v\" (UniqueName: \"kubernetes.io/projected/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-kube-api-access-8f56v\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231850 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfgv\" (UniqueName: \"kubernetes.io/projected/67a4146b-61bc-4522-8247-43bbfa3dfed7-kube-api-access-tsfgv\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231875 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231919 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-svc\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231939 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.231956 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 
crc kubenswrapper[4852]: I1210 12:15:47.232002 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.232026 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a4146b-61bc-4522-8247-43bbfa3dfed7-logs\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.232050 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-config\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.232072 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.237187 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a4146b-61bc-4522-8247-43bbfa3dfed7-logs\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.255337 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.256756 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.260316 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-config-data\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.267654 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfgv\" (UniqueName: \"kubernetes.io/projected/67a4146b-61bc-4522-8247-43bbfa3dfed7-kube-api-access-tsfgv\") pod \"nova-api-0\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") " pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.267535 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.269887 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f56v\" (UniqueName: \"kubernetes.io/projected/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-kube-api-access-8f56v\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.335279 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.335391 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-config\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.335513 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2rnx\" (UniqueName: \"kubernetes.io/projected/e127297c-061e-457a-9c2a-5794a1f39a3a-kube-api-access-g2rnx\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.335683 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.335790 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-svc\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.335830 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.336720 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-config\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.336871 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc 
kubenswrapper[4852]: I1210 12:15:47.337056 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.337557 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-svc\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.338163 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.363289 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2rnx\" (UniqueName: \"kubernetes.io/projected/e127297c-061e-457a-9c2a-5794a1f39a3a-kube-api-access-g2rnx\") pod \"dnsmasq-dns-bccf8f775-x7855\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") " pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.432843 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.454676 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.466757 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.692398 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b68mm"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.694180 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.698475 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.702290 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.728055 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b68mm"] Dec 10 12:15:47 crc kubenswrapper[4852]: W1210 12:15:47.734350 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f3043a_1163_4556_bcd1_9fa1087d61f1.slice/crio-b97b0de3716fa7229e4669e1545206da72b9fe45518a390623e04ade6e54b546 WatchSource:0}: Error finding container b97b0de3716fa7229e4669e1545206da72b9fe45518a390623e04ade6e54b546: Status 404 returned error can't find the container with id b97b0de3716fa7229e4669e1545206da72b9fe45518a390623e04ade6e54b546 Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.736176 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.744626 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.749335 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.749467 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-config-data\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.749509 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8tgn\" (UniqueName: \"kubernetes.io/projected/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-kube-api-access-h8tgn\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.750207 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-scripts\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.852503 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-scripts\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 
12:15:47.852702 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.852954 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-config-data\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.852984 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8tgn\" (UniqueName: \"kubernetes.io/projected/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-kube-api-access-h8tgn\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.861837 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.862406 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-scripts\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.863124 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-config-data\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.873444 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8tgn\" (UniqueName: \"kubernetes.io/projected/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-kube-api-access-h8tgn\") pod \"nova-cell1-conductor-db-sync-b68mm\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:47 crc kubenswrapper[4852]: I1210 12:15:47.922984 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-svlcr"] Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.031572 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.124102 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.230645 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-x7855"] Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.249972 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.569985 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b68mm"] Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.719183 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67a4146b-61bc-4522-8247-43bbfa3dfed7","Type":"ContainerStarted","Data":"21787076010daa9e8a8efb09e7271ec17c7f9f4c5dff1076479bba46af674513"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.724109 4852 generic.go:334] "Generic (PLEG): container finished" podID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerID="aa7fd083008d9cb0d131aea641f785353c7942b80c415c6ae5dfd44dc153b228" exitCode=0 Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.724185 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-x7855" event={"ID":"e127297c-061e-457a-9c2a-5794a1f39a3a","Type":"ContainerDied","Data":"aa7fd083008d9cb0d131aea641f785353c7942b80c415c6ae5dfd44dc153b228"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.724215 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-x7855" event={"ID":"e127297c-061e-457a-9c2a-5794a1f39a3a","Type":"ContainerStarted","Data":"c19d68174beb38f04cfb0a72e72a7613ea3d967ed23a81449cddeaf6fc84970f"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.740101 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9ae3fdf-8e16-4763-8f50-389a52334458","Type":"ContainerStarted","Data":"f8b5c8edd64e4a18c82b54867c2f8da111eeb5455a14e1bce52bc2a7bbcc4685"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.752252 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-svlcr" event={"ID":"71c50ff1-4991-48d8-9b54-f49f711fffb6","Type":"ContainerStarted","Data":"a1137ef0204eadd0d77af87557be669f16776973d46f520e5a48094514ced9b2"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.752301 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-svlcr" event={"ID":"71c50ff1-4991-48d8-9b54-f49f711fffb6","Type":"ContainerStarted","Data":"c1e580ac31e4405f308b9f9ecc795ebbe8daf3e17198b0f26dff14b33faee942"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.755866 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8f3043a-1163-4556-bcd1-9fa1087d61f1","Type":"ContainerStarted","Data":"b97b0de3716fa7229e4669e1545206da72b9fe45518a390623e04ade6e54b546"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.763859 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b68mm" event={"ID":"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902","Type":"ContainerStarted","Data":"a6f0fe96f3752f8e139775e3c8c69fe2e3b90e3ba8522a3aa0be5b1355068702"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.767156 4852 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2","Type":"ContainerStarted","Data":"f76878bc782ca6d28c9f76042206a6f02b134a9a41842ebde51c5af6f49bbfa4"} Dec 10 12:15:48 crc kubenswrapper[4852]: I1210 12:15:48.801450 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-svlcr" podStartSLOduration=2.80142679 podStartE2EDuration="2.80142679s" podCreationTimestamp="2025-12-10 12:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:15:48.775767439 +0000 UTC m=+1434.861292673" watchObservedRunningTime="2025-12-10 12:15:48.80142679 +0000 UTC m=+1434.886952014" Dec 10 12:15:49 crc kubenswrapper[4852]: I1210 12:15:49.777267 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b68mm" event={"ID":"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902","Type":"ContainerStarted","Data":"66e553cae7591ba1c5daa51fce62c0e4f906c9fa7ead5729b146abf1dc87472d"} Dec 10 12:15:49 crc kubenswrapper[4852]: I1210 12:15:49.782393 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-x7855" event={"ID":"e127297c-061e-457a-9c2a-5794a1f39a3a","Type":"ContainerStarted","Data":"1e65996dcd79af814e189fefb8a535c89ac0f12b73ef2b900b3f8f137a86bb29"} Dec 10 12:15:49 crc kubenswrapper[4852]: I1210 12:15:49.782542 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:49 crc kubenswrapper[4852]: I1210 12:15:49.807307 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-b68mm" podStartSLOduration=2.8072825630000002 podStartE2EDuration="2.807282563s" podCreationTimestamp="2025-12-10 12:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:15:49.794994706 +0000 UTC m=+1435.880519940" watchObservedRunningTime="2025-12-10 12:15:49.807282563 +0000 UTC m=+1435.892807787" Dec 10 12:15:49 crc kubenswrapper[4852]: I1210 12:15:49.830868 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-x7855" podStartSLOduration=3.8308434719999997 podStartE2EDuration="3.830843472s" podCreationTimestamp="2025-12-10 12:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:15:49.816732109 +0000 UTC m=+1435.902257353" watchObservedRunningTime="2025-12-10 12:15:49.830843472 +0000 UTC m=+1435.916368706" Dec 10 12:15:50 crc kubenswrapper[4852]: I1210 12:15:50.212931 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:15:50 crc kubenswrapper[4852]: I1210 12:15:50.213183 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.820543 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2","Type":"ContainerStarted","Data":"c58b23ccaf518a2f72729195b40d6bca8271c7eacb17384f74613307f5af9fe1"} Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.820660 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c58b23ccaf518a2f72729195b40d6bca8271c7eacb17384f74613307f5af9fe1" gracePeriod=30 Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.823736 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67a4146b-61bc-4522-8247-43bbfa3dfed7","Type":"ContainerStarted","Data":"2479c986a5ad88aa043914613b01ceb1d3cd197822b3893de0fad5ade1954b00"} Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.823775 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67a4146b-61bc-4522-8247-43bbfa3dfed7","Type":"ContainerStarted","Data":"878cf13b30dcfc5c838fceac47cbdd1471638fa40f5c5b1636bf4335f7b73eb0"} Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.826486 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9ae3fdf-8e16-4763-8f50-389a52334458","Type":"ContainerStarted","Data":"a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644"} Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.829698 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8f3043a-1163-4556-bcd1-9fa1087d61f1","Type":"ContainerStarted","Data":"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b"} Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.829741 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8f3043a-1163-4556-bcd1-9fa1087d61f1","Type":"ContainerStarted","Data":"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a"} Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.829856 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-metadata" containerID="cri-o://e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b" gracePeriod=30 Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.829823 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-log" containerID="cri-o://34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a" gracePeriod=30 Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.849354 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.531764429 podStartE2EDuration="6.849331395s" podCreationTimestamp="2025-12-10 12:15:46 +0000 UTC" firstStartedPulling="2025-12-10 12:15:48.230444402 +0000 UTC m=+1434.315969626" lastFinishedPulling="2025-12-10 12:15:51.548011368 +0000 UTC m=+1437.633536592" observedRunningTime="2025-12-10 12:15:52.838882964 +0000 UTC m=+1438.924408188" watchObservedRunningTime="2025-12-10 12:15:52.849331395 +0000 UTC m=+1438.934856629" Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.862965 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.068327739 podStartE2EDuration="6.862944685s" podCreationTimestamp="2025-12-10 12:15:46 +0000 UTC" firstStartedPulling="2025-12-10 12:15:47.751822363 +0000 UTC m=+1433.837347587" lastFinishedPulling="2025-12-10 12:15:51.546439309 +0000 UTC m=+1437.631964533" observedRunningTime="2025-12-10 12:15:52.856074363 +0000 UTC 
m=+1438.941599597" watchObservedRunningTime="2025-12-10 12:15:52.862944685 +0000 UTC m=+1438.948469929" Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.881123 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.462839357 podStartE2EDuration="6.881102479s" podCreationTimestamp="2025-12-10 12:15:46 +0000 UTC" firstStartedPulling="2025-12-10 12:15:48.140020833 +0000 UTC m=+1434.225546067" lastFinishedPulling="2025-12-10 12:15:51.558283975 +0000 UTC m=+1437.643809189" observedRunningTime="2025-12-10 12:15:52.874449552 +0000 UTC m=+1438.959974786" watchObservedRunningTime="2025-12-10 12:15:52.881102479 +0000 UTC m=+1438.966627713" Dec 10 12:15:52 crc kubenswrapper[4852]: I1210 12:15:52.899217 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.093487868 podStartE2EDuration="6.899191791s" podCreationTimestamp="2025-12-10 12:15:46 +0000 UTC" firstStartedPulling="2025-12-10 12:15:47.742366907 +0000 UTC m=+1433.827892131" lastFinishedPulling="2025-12-10 12:15:51.54807082 +0000 UTC m=+1437.633596054" observedRunningTime="2025-12-10 12:15:52.891088508 +0000 UTC m=+1438.976613752" watchObservedRunningTime="2025-12-10 12:15:52.899191791 +0000 UTC m=+1438.984717035" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.454732 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.548793 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-config-data\") pod \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.548843 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-combined-ca-bundle\") pod \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.549031 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f3043a-1163-4556-bcd1-9fa1087d61f1-logs\") pod \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.549587 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wl8v\" (UniqueName: \"kubernetes.io/projected/f8f3043a-1163-4556-bcd1-9fa1087d61f1-kube-api-access-4wl8v\") pod \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\" (UID: \"f8f3043a-1163-4556-bcd1-9fa1087d61f1\") " Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.553315 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f3043a-1163-4556-bcd1-9fa1087d61f1-logs" (OuterVolumeSpecName: "logs") pod "f8f3043a-1163-4556-bcd1-9fa1087d61f1" (UID: "f8f3043a-1163-4556-bcd1-9fa1087d61f1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.557124 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f3043a-1163-4556-bcd1-9fa1087d61f1-kube-api-access-4wl8v" (OuterVolumeSpecName: "kube-api-access-4wl8v") pod "f8f3043a-1163-4556-bcd1-9fa1087d61f1" (UID: "f8f3043a-1163-4556-bcd1-9fa1087d61f1"). InnerVolumeSpecName "kube-api-access-4wl8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.591115 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-config-data" (OuterVolumeSpecName: "config-data") pod "f8f3043a-1163-4556-bcd1-9fa1087d61f1" (UID: "f8f3043a-1163-4556-bcd1-9fa1087d61f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.592840 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8f3043a-1163-4556-bcd1-9fa1087d61f1" (UID: "f8f3043a-1163-4556-bcd1-9fa1087d61f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.652703 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.652740 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f3043a-1163-4556-bcd1-9fa1087d61f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.652754 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8f3043a-1163-4556-bcd1-9fa1087d61f1-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.652766 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wl8v\" (UniqueName: \"kubernetes.io/projected/f8f3043a-1163-4556-bcd1-9fa1087d61f1-kube-api-access-4wl8v\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.841800 4852 generic.go:334] "Generic (PLEG): container finished" podID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerID="e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b" exitCode=0 Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.841834 4852 generic.go:334] "Generic (PLEG): container finished" podID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerID="34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a" exitCode=143 Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.841838 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.841909 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8f3043a-1163-4556-bcd1-9fa1087d61f1","Type":"ContainerDied","Data":"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b"} Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.841967 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8f3043a-1163-4556-bcd1-9fa1087d61f1","Type":"ContainerDied","Data":"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a"} Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.841983 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8f3043a-1163-4556-bcd1-9fa1087d61f1","Type":"ContainerDied","Data":"b97b0de3716fa7229e4669e1545206da72b9fe45518a390623e04ade6e54b546"} Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.842005 4852 scope.go:117] "RemoveContainer" containerID="e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.873517 4852 scope.go:117] "RemoveContainer" containerID="34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.879782 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.894976 4852 scope.go:117] "RemoveContainer" containerID="e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b" Dec 10 12:15:53 crc kubenswrapper[4852]: E1210 12:15:53.896698 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b\": container with ID starting with e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b not found: ID does not exist" containerID="e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.896745 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b"} err="failed to get container status \"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b\": rpc error: code = NotFound desc = could not find container \"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b\": container with ID starting with e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b not found: ID does not exist" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.896775 4852 scope.go:117] "RemoveContainer" containerID="34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.899317 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:53 crc kubenswrapper[4852]: E1210 12:15:53.899628 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a\": container with ID starting with 34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a not found: ID does not exist" containerID="34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 
12:15:53.899654 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a"} err="failed to get container status \"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a\": rpc error: code = NotFound desc = could not find container \"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a\": container with ID starting with 34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a not found: ID does not exist" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.899674 4852 scope.go:117] "RemoveContainer" containerID="e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.900150 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b"} err="failed to get container status \"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b\": rpc error: code = NotFound desc = could not find container \"e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b\": container with ID starting with e8359d5729f56afb5d7a788d21b8181bf9897ad2faac4bc84440202be069159b not found: ID does not exist" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.900204 4852 scope.go:117] "RemoveContainer" containerID="34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.900523 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a"} err="failed to get container status \"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a\": rpc error: code = NotFound desc = could not find container \"34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a\": container with ID starting with 34255a6200582dc2e71323c7a271f9be2d3f8ee2146c23766141730006fa234a not found: ID does not exist" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.908852 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:53 crc kubenswrapper[4852]: E1210 12:15:53.909400 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-log" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.909424 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-log" Dec 10 12:15:53 crc kubenswrapper[4852]: E1210 12:15:53.909441 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-metadata" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.909449 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-metadata" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.909696 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-metadata" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.909727 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" containerName="nova-metadata-log" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.911051 4852 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.916669 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.916717 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 10 12:15:53 crc kubenswrapper[4852]: I1210 12:15:53.919132 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.059729 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5ns\" (UniqueName: \"kubernetes.io/projected/fc12ecc4-9a32-4c62-9509-798e0f6784a5-kube-api-access-fr5ns\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.060188 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.060260 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.060301 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-config-data\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.060344 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc12ecc4-9a32-4c62-9509-798e0f6784a5-logs\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.162345 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.162437 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.162492 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-config-data\") pod \"nova-metadata-0\" (UID: 
\"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.162542 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc12ecc4-9a32-4c62-9509-798e0f6784a5-logs\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.162631 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5ns\" (UniqueName: \"kubernetes.io/projected/fc12ecc4-9a32-4c62-9509-798e0f6784a5-kube-api-access-fr5ns\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.163084 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc12ecc4-9a32-4c62-9509-798e0f6784a5-logs\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.165182 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.165181 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.166526 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.187269 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-config-data\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.190692 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr5ns\" (UniqueName: \"kubernetes.io/projected/fc12ecc4-9a32-4c62-9509-798e0f6784a5-kube-api-access-fr5ns\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.191839 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") " pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.193287 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f3043a-1163-4556-bcd1-9fa1087d61f1" path="/var/lib/kubelet/pods/f8f3043a-1163-4556-bcd1-9fa1087d61f1/volumes" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.234460 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.694703 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:54 crc kubenswrapper[4852]: I1210 12:15:54.859769 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc12ecc4-9a32-4c62-9509-798e0f6784a5","Type":"ContainerStarted","Data":"3b7fac7e25a3591d8b9e18759d8dbee381e2e83decb1024ccefdf3686954c8d0"} Dec 10 12:15:55 crc kubenswrapper[4852]: I1210 12:15:55.872623 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc12ecc4-9a32-4c62-9509-798e0f6784a5","Type":"ContainerStarted","Data":"6de144ae826874068a80869fb720cd775b1ff94ea940ae31fa46ff2b73124366"} Dec 10 12:15:55 crc kubenswrapper[4852]: I1210 12:15:55.872920 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc12ecc4-9a32-4c62-9509-798e0f6784a5","Type":"ContainerStarted","Data":"223fcf80ddc788967b643b71d2e171d53ef87db912e4ef2e32d508020ee33fbe"} Dec 10 12:15:55 crc kubenswrapper[4852]: I1210 12:15:55.874414 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 10 12:15:55 crc kubenswrapper[4852]: I1210 12:15:55.902054 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9020372229999998 podStartE2EDuration="2.902037223s" podCreationTimestamp="2025-12-10 12:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:15:55.89634178 +0000 UTC m=+1441.981867004" watchObservedRunningTime="2025-12-10 12:15:55.902037223 +0000 UTC m=+1441.987562447" Dec 10 12:15:56 crc kubenswrapper[4852]: I1210 12:15:56.885267 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-svlcr" event={"ID":"71c50ff1-4991-48d8-9b54-f49f711fffb6","Type":"ContainerDied","Data":"a1137ef0204eadd0d77af87557be669f16776973d46f520e5a48094514ced9b2"} Dec 10 12:15:56 crc kubenswrapper[4852]: I1210 12:15:56.885216 4852 generic.go:334] "Generic (PLEG): container finished" podID="71c50ff1-4991-48d8-9b54-f49f711fffb6" containerID="a1137ef0204eadd0d77af87557be669f16776973d46f520e5a48094514ced9b2" exitCode=0 Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.032344 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.032723 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.062068 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.433744 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.433812 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.456380 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.469412 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-bccf8f775-x7855" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.545543 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dh4tm"] Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.545762 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="dnsmasq-dns" containerID="cri-o://f4ba41145f05a7545fb638ab4bbc41aa422573ad2488530bf8b1c115bdfae972" gracePeriod=10 Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.764716 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.899366 4852 generic.go:334] "Generic (PLEG): container finished" podID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerID="f4ba41145f05a7545fb638ab4bbc41aa422573ad2488530bf8b1c115bdfae972" exitCode=0 Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.899442 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" event={"ID":"6792d3ff-5d80-410e-98c4-57dc79836a58","Type":"ContainerDied","Data":"f4ba41145f05a7545fb638ab4bbc41aa422573ad2488530bf8b1c115bdfae972"} Dec 10 12:15:57 crc kubenswrapper[4852]: I1210 12:15:57.957764 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.298944 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.356474 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-config-data\") pod \"71c50ff1-4991-48d8-9b54-f49f711fffb6\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.356554 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-combined-ca-bundle\") pod \"71c50ff1-4991-48d8-9b54-f49f711fffb6\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.356689 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-scripts\") pod \"71c50ff1-4991-48d8-9b54-f49f711fffb6\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.356739 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g694\" (UniqueName: \"kubernetes.io/projected/71c50ff1-4991-48d8-9b54-f49f711fffb6-kube-api-access-5g694\") pod \"71c50ff1-4991-48d8-9b54-f49f711fffb6\" (UID: \"71c50ff1-4991-48d8-9b54-f49f711fffb6\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.365636 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-scripts" (OuterVolumeSpecName: "scripts") pod "71c50ff1-4991-48d8-9b54-f49f711fffb6" (UID: 
"71c50ff1-4991-48d8-9b54-f49f711fffb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.375393 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c50ff1-4991-48d8-9b54-f49f711fffb6-kube-api-access-5g694" (OuterVolumeSpecName: "kube-api-access-5g694") pod "71c50ff1-4991-48d8-9b54-f49f711fffb6" (UID: "71c50ff1-4991-48d8-9b54-f49f711fffb6"). InnerVolumeSpecName "kube-api-access-5g694". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.393428 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71c50ff1-4991-48d8-9b54-f49f711fffb6" (UID: "71c50ff1-4991-48d8-9b54-f49f711fffb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.398991 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-config-data" (OuterVolumeSpecName: "config-data") pod "71c50ff1-4991-48d8-9b54-f49f711fffb6" (UID: "71c50ff1-4991-48d8-9b54-f49f711fffb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.459575 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.459621 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g694\" (UniqueName: \"kubernetes.io/projected/71c50ff1-4991-48d8-9b54-f49f711fffb6-kube-api-access-5g694\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.459636 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.459648 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c50ff1-4991-48d8-9b54-f49f711fffb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.517479 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.518212 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.559307 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.662264 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5dz4\" (UniqueName: \"kubernetes.io/projected/6792d3ff-5d80-410e-98c4-57dc79836a58-kube-api-access-g5dz4\") pod \"6792d3ff-5d80-410e-98c4-57dc79836a58\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.662317 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-sb\") pod \"6792d3ff-5d80-410e-98c4-57dc79836a58\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.662380 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-svc\") pod \"6792d3ff-5d80-410e-98c4-57dc79836a58\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.662460 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-swift-storage-0\") pod \"6792d3ff-5d80-410e-98c4-57dc79836a58\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.662503 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-config\") pod \"6792d3ff-5d80-410e-98c4-57dc79836a58\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.662619 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-nb\") pod \"6792d3ff-5d80-410e-98c4-57dc79836a58\" (UID: \"6792d3ff-5d80-410e-98c4-57dc79836a58\") " Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.669860 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6792d3ff-5d80-410e-98c4-57dc79836a58-kube-api-access-g5dz4" (OuterVolumeSpecName: "kube-api-access-g5dz4") pod "6792d3ff-5d80-410e-98c4-57dc79836a58" (UID: "6792d3ff-5d80-410e-98c4-57dc79836a58"). InnerVolumeSpecName "kube-api-access-g5dz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.729396 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6792d3ff-5d80-410e-98c4-57dc79836a58" (UID: "6792d3ff-5d80-410e-98c4-57dc79836a58"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.730311 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6792d3ff-5d80-410e-98c4-57dc79836a58" (UID: "6792d3ff-5d80-410e-98c4-57dc79836a58"). InnerVolumeSpecName "ovsdbserver-nb". 
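The nova-api-0 startup-probe failures a few entries above are plain HTTP GETs against the pod IP that timed out waiting for response headers. A stdlib-only sketch of the same check (endpoint copied from the log; the one-second timeout is an assumption standing in for the probe's configured timeoutSeconds):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Same endpoint the kubelet probed: http://<podIP>:8774/
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get("http://10.217.0.188:8774/")
	if err != nil {
		// Failure mode seen in the log:
		// "context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe ok:", resp.Status)
}
```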
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.745879 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6792d3ff-5d80-410e-98c4-57dc79836a58" (UID: "6792d3ff-5d80-410e-98c4-57dc79836a58"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.748050 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6792d3ff-5d80-410e-98c4-57dc79836a58" (UID: "6792d3ff-5d80-410e-98c4-57dc79836a58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.759628 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-config" (OuterVolumeSpecName: "config") pod "6792d3ff-5d80-410e-98c4-57dc79836a58" (UID: "6792d3ff-5d80-410e-98c4-57dc79836a58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.767790 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.767830 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.767842 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5dz4\" (UniqueName: \"kubernetes.io/projected/6792d3ff-5d80-410e-98c4-57dc79836a58-kube-api-access-g5dz4\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.767855 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.767865 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.767878 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792d3ff-5d80-410e-98c4-57dc79836a58-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.909856 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" event={"ID":"6792d3ff-5d80-410e-98c4-57dc79836a58","Type":"ContainerDied","Data":"89d931cfd44a4b6eb8f561dd996763015caf3d24f05642e6606c0ddd3db8a439"} Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.909905 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dh4tm" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.909924 4852 scope.go:117] "RemoveContainer" containerID="f4ba41145f05a7545fb638ab4bbc41aa422573ad2488530bf8b1c115bdfae972" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.912027 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-svlcr" event={"ID":"71c50ff1-4991-48d8-9b54-f49f711fffb6","Type":"ContainerDied","Data":"c1e580ac31e4405f308b9f9ecc795ebbe8daf3e17198b0f26dff14b33faee942"} Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.912075 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1e580ac31e4405f308b9f9ecc795ebbe8daf3e17198b0f26dff14b33faee942" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.912045 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-svlcr" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.942646 4852 scope.go:117] "RemoveContainer" containerID="b206a9b1d279bf59e81117c94f11c7f407bae80d59916a83bea99e87a869fc46" Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.960446 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dh4tm"] Dec 10 12:15:58 crc kubenswrapper[4852]: I1210 12:15:58.971302 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dh4tm"] Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.216860 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.217128 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-log" containerID="cri-o://878cf13b30dcfc5c838fceac47cbdd1471638fa40f5c5b1636bf4335f7b73eb0" gracePeriod=30 Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.217722 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-api" containerID="cri-o://2479c986a5ad88aa043914613b01ceb1d3cd197822b3893de0fad5ade1954b00" gracePeriod=30 Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.229655 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.235501 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.236385 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.239591 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.925223 4852 generic.go:334] "Generic (PLEG): container finished" podID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerID="878cf13b30dcfc5c838fceac47cbdd1471638fa40f5c5b1636bf4335f7b73eb0" exitCode=143 Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.925565 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67a4146b-61bc-4522-8247-43bbfa3dfed7","Type":"ContainerDied","Data":"878cf13b30dcfc5c838fceac47cbdd1471638fa40f5c5b1636bf4335f7b73eb0"} Dec 10 12:15:59 crc kubenswrapper[4852]: I1210 12:15:59.926823 
4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d9ae3fdf-8e16-4763-8f50-389a52334458" containerName="nova-scheduler-scheduler" containerID="cri-o://a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644" gracePeriod=30 Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.184410 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" path="/var/lib/kubelet/pods/6792d3ff-5d80-410e-98c4-57dc79836a58/volumes" Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.204852 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.205088 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="681949fa-a426-400e-8f81-475a0555dc08" containerName="kube-state-metrics" containerID="cri-o://ba91414b2b8cececfe2dd537993001c8ce1311c25ee723642eaaa47f1e5eed51" gracePeriod=30 Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.934606 4852 generic.go:334] "Generic (PLEG): container finished" podID="681949fa-a426-400e-8f81-475a0555dc08" containerID="ba91414b2b8cececfe2dd537993001c8ce1311c25ee723642eaaa47f1e5eed51" exitCode=2 Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.934776 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-log" containerID="cri-o://223fcf80ddc788967b643b71d2e171d53ef87db912e4ef2e32d508020ee33fbe" gracePeriod=30 Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.935033 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"681949fa-a426-400e-8f81-475a0555dc08","Type":"ContainerDied","Data":"ba91414b2b8cececfe2dd537993001c8ce1311c25ee723642eaaa47f1e5eed51"} Dec 10 12:16:00 crc kubenswrapper[4852]: I1210 12:16:00.935329 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-metadata" containerID="cri-o://6de144ae826874068a80869fb720cd775b1ff94ea940ae31fa46ff2b73124366" gracePeriod=30 Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.877493 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.922434 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctdwr\" (UniqueName: \"kubernetes.io/projected/681949fa-a426-400e-8f81-475a0555dc08-kube-api-access-ctdwr\") pod \"681949fa-a426-400e-8f81-475a0555dc08\" (UID: \"681949fa-a426-400e-8f81-475a0555dc08\") " Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.927671 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/681949fa-a426-400e-8f81-475a0555dc08-kube-api-access-ctdwr" (OuterVolumeSpecName: "kube-api-access-ctdwr") pod "681949fa-a426-400e-8f81-475a0555dc08" (UID: "681949fa-a426-400e-8f81-475a0555dc08"). InnerVolumeSpecName "kube-api-access-ctdwr". 
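The "Killing container with a grace period" entries above (gracePeriod=30 here, gracePeriod=10 for the dnsmasq pod earlier) carry the grace period attached to the pod DELETE. A client-go sketch of issuing such a delete (pod and namespace names from the log; running in-cluster is an assumption). The kubelet sends SIGTERM first and SIGKILLs only after the grace period expires, which is why the log-tailing containers above exit with code 143 (128+15, SIGTERM):

```go
package main

import (
	"context"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumption: running inside the cluster
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// Matches gracePeriod=30 in the log: SIGTERM immediately, SIGKILL after 30s.
	gp := int64(30)
	if err := cs.CoreV1().Pods("openstack").Delete(context.TODO(),
		"nova-scheduler-0", metav1.DeleteOptions{GracePeriodSeconds: &gp}); err != nil {
		log.Fatal(err)
	}
}
```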
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.953466 4852 generic.go:334] "Generic (PLEG): container finished" podID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerID="6de144ae826874068a80869fb720cd775b1ff94ea940ae31fa46ff2b73124366" exitCode=0 Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.953501 4852 generic.go:334] "Generic (PLEG): container finished" podID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerID="223fcf80ddc788967b643b71d2e171d53ef87db912e4ef2e32d508020ee33fbe" exitCode=143 Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.953545 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc12ecc4-9a32-4c62-9509-798e0f6784a5","Type":"ContainerDied","Data":"6de144ae826874068a80869fb720cd775b1ff94ea940ae31fa46ff2b73124366"} Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.953587 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc12ecc4-9a32-4c62-9509-798e0f6784a5","Type":"ContainerDied","Data":"223fcf80ddc788967b643b71d2e171d53ef87db912e4ef2e32d508020ee33fbe"} Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.955477 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"681949fa-a426-400e-8f81-475a0555dc08","Type":"ContainerDied","Data":"19a752e59d7fd623214bbccbb079dfb6e784e5866b40fc5f8f47d72b1cae5e56"} Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.955594 4852 scope.go:117] "RemoveContainer" containerID="ba91414b2b8cececfe2dd537993001c8ce1311c25ee723642eaaa47f1e5eed51" Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.955512 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 10 12:16:01 crc kubenswrapper[4852]: I1210 12:16:01.995102 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.006347 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.015538 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.015888 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="init" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.015903 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="init" Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.015919 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681949fa-a426-400e-8f81-475a0555dc08" containerName="kube-state-metrics" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.015924 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="681949fa-a426-400e-8f81-475a0555dc08" containerName="kube-state-metrics" Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.015943 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="dnsmasq-dns" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.015949 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="dnsmasq-dns" Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.015967 4852 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="71c50ff1-4991-48d8-9b54-f49f711fffb6" containerName="nova-manage" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.015973 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c50ff1-4991-48d8-9b54-f49f711fffb6" containerName="nova-manage" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.016134 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="6792d3ff-5d80-410e-98c4-57dc79836a58" containerName="dnsmasq-dns" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.016147 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="681949fa-a426-400e-8f81-475a0555dc08" containerName="kube-state-metrics" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.016164 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c50ff1-4991-48d8-9b54-f49f711fffb6" containerName="nova-manage" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.016789 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.018598 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.025140 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctdwr\" (UniqueName: \"kubernetes.io/projected/681949fa-a426-400e-8f81-475a0555dc08-kube-api-access-ctdwr\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.026055 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.026902 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.034073 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.035738 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.038202 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 10 12:16:02 crc kubenswrapper[4852]: E1210 12:16:02.038303 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d9ae3fdf-8e16-4763-8f50-389a52334458" containerName="nova-scheduler-scheduler" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.121539 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.121969 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="proxy-httpd" containerID="cri-o://923dcee73f8bea88d93ebec76d72a03a548982e682d04deb32a2b83bc3729039" gracePeriod=30 Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.121998 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-notification-agent" containerID="cri-o://f420eacbf4fc57ab16572d037b25d04c4bad0375c0a7c419cc1e238afe9cd3b6" gracePeriod=30 Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.121861 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-central-agent" containerID="cri-o://d40dd3fc0e1816f8acd2a31c4fee5f68835ed642fcb98f672584276c1e167de2" gracePeriod=30 Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.122640 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="sg-core" containerID="cri-o://7a871a26d3ae77a77a248583bbbe2d55f912bf2d0b45a0a2d07190bcc3fbab1b" gracePeriod=30 Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.127193 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.127668 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.127916 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckfz\" (UniqueName: \"kubernetes.io/projected/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-api-access-zckfz\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.128032 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.180336 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="681949fa-a426-400e-8f81-475a0555dc08" path="/var/lib/kubelet/pods/681949fa-a426-400e-8f81-475a0555dc08/volumes" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.226854 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5n5vm"] Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.234023 4852 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.242090 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.242364 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.242528 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckfz\" (UniqueName: \"kubernetes.io/projected/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-api-access-zckfz\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.242556 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.270753 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.274901 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.278778 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.283146 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckfz\" (UniqueName: \"kubernetes.io/projected/20797400-1dd7-4c4b-af50-9f0c839a06c6-kube-api-access-zckfz\") pod \"kube-state-metrics-0\" (UID: \"20797400-1dd7-4c4b-af50-9f0c839a06c6\") " pod="openstack/kube-state-metrics-0" Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.288488 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5n5vm"] Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.334132 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.345074 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-catalog-content\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.345150 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzs59\" (UniqueName: \"kubernetes.io/projected/effe5adf-90b0-4a00-a67a-589d4f355203-kube-api-access-lzs59\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.345357 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-utilities\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.447634 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-catalog-content\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.448393 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzs59\" (UniqueName: \"kubernetes.io/projected/effe5adf-90b0-4a00-a67a-589d4f355203-kube-api-access-lzs59\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.448553 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-catalog-content\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.448561 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-utilities\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.449352 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-utilities\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.478968 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzs59\" (UniqueName: \"kubernetes.io/projected/effe5adf-90b0-4a00-a67a-589d4f355203-kube-api-access-lzs59\") pod \"redhat-operators-5n5vm\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.659907 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5n5vm"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.868340 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.968300 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"20797400-1dd7-4c4b-af50-9f0c839a06c6","Type":"ContainerStarted","Data":"c350a16e432101c8cc9e76f42993cdacf0cfec718977d0148d3d536eaae23189"}
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.972397 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc12ecc4-9a32-4c62-9509-798e0f6784a5","Type":"ContainerDied","Data":"3b7fac7e25a3591d8b9e18759d8dbee381e2e83decb1024ccefdf3686954c8d0"}
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.972631 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7fac7e25a3591d8b9e18759d8dbee381e2e83decb1024ccefdf3686954c8d0"
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.975683 4852 generic.go:334] "Generic (PLEG): container finished" podID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerID="923dcee73f8bea88d93ebec76d72a03a548982e682d04deb32a2b83bc3729039" exitCode=0
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.975838 4852 generic.go:334] "Generic (PLEG): container finished" podID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerID="7a871a26d3ae77a77a248583bbbe2d55f912bf2d0b45a0a2d07190bcc3fbab1b" exitCode=2
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.975922 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerDied","Data":"923dcee73f8bea88d93ebec76d72a03a548982e682d04deb32a2b83bc3729039"}
Dec 10 12:16:02 crc kubenswrapper[4852]: I1210 12:16:02.976004 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerDied","Data":"7a871a26d3ae77a77a248583bbbe2d55f912bf2d0b45a0a2d07190bcc3fbab1b"}
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.016532 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061091 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-config-data\") pod \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") "
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061130 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc12ecc4-9a32-4c62-9509-798e0f6784a5-logs\") pod \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") "
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061156 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-combined-ca-bundle\") pod \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") "
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061270 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-nova-metadata-tls-certs\") pod \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") "
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061302 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr5ns\" (UniqueName: \"kubernetes.io/projected/fc12ecc4-9a32-4c62-9509-798e0f6784a5-kube-api-access-fr5ns\") pod \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\" (UID: \"fc12ecc4-9a32-4c62-9509-798e0f6784a5\") "
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061481 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc12ecc4-9a32-4c62-9509-798e0f6784a5-logs" (OuterVolumeSpecName: "logs") pod "fc12ecc4-9a32-4c62-9509-798e0f6784a5" (UID: "fc12ecc4-9a32-4c62-9509-798e0f6784a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.061717 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc12ecc4-9a32-4c62-9509-798e0f6784a5-logs\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.071337 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc12ecc4-9a32-4c62-9509-798e0f6784a5-kube-api-access-fr5ns" (OuterVolumeSpecName: "kube-api-access-fr5ns") pod "fc12ecc4-9a32-4c62-9509-798e0f6784a5" (UID: "fc12ecc4-9a32-4c62-9509-798e0f6784a5"). InnerVolumeSpecName "kube-api-access-fr5ns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.098825 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-config-data" (OuterVolumeSpecName: "config-data") pod "fc12ecc4-9a32-4c62-9509-798e0f6784a5" (UID: "fc12ecc4-9a32-4c62-9509-798e0f6784a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.105837 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc12ecc4-9a32-4c62-9509-798e0f6784a5" (UID: "fc12ecc4-9a32-4c62-9509-798e0f6784a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.144902 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fc12ecc4-9a32-4c62-9509-798e0f6784a5" (UID: "fc12ecc4-9a32-4c62-9509-798e0f6784a5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.163045 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.163089 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.163103 4852 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc12ecc4-9a32-4c62-9509-798e0f6784a5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.163115 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr5ns\" (UniqueName: \"kubernetes.io/projected/fc12ecc4-9a32-4c62-9509-798e0f6784a5-kube-api-access-fr5ns\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.234735 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5n5vm"]
Dec 10 12:16:03 crc kubenswrapper[4852]: W1210 12:16:03.243439 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeffe5adf_90b0_4a00_a67a_589d4f355203.slice/crio-e123e4576f4b383cb5d0c619f8d7d4de390f015a2f185a0ed7e39b024a470cc1 WatchSource:0}: Error finding container e123e4576f4b383cb5d0c619f8d7d4de390f015a2f185a0ed7e39b024a470cc1: Status 404 returned error can't find the container with id e123e4576f4b383cb5d0c619f8d7d4de390f015a2f185a0ed7e39b024a470cc1
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.985129 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerStarted","Data":"b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9"}
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.985460 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerStarted","Data":"e123e4576f4b383cb5d0c619f8d7d4de390f015a2f185a0ed7e39b024a470cc1"}
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.989166 4852 generic.go:334] "Generic (PLEG): container finished" podID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerID="d40dd3fc0e1816f8acd2a31c4fee5f68835ed642fcb98f672584276c1e167de2" exitCode=0
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.989278 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:03 crc kubenswrapper[4852]: I1210 12:16:03.989281 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerDied","Data":"d40dd3fc0e1816f8acd2a31c4fee5f68835ed642fcb98f672584276c1e167de2"}
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.093475 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.113528 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.129595 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:04 crc kubenswrapper[4852]: E1210 12:16:04.130106 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-metadata"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.130130 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-metadata"
Dec 10 12:16:04 crc kubenswrapper[4852]: E1210 12:16:04.130165 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-log"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.130176 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-log"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.130443 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-log"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.130483 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" containerName="nova-metadata-metadata"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.131761 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.135086 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.135449 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.139277 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.180874 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c899c38-c8c0-4524-9bb5-ec72cd80c806-logs\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.180934 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.181020 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/4c899c38-c8c0-4524-9bb5-ec72cd80c806-kube-api-access-226hp\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.181104 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-config-data\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.181750 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.181817 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc12ecc4-9a32-4c62-9509-798e0f6784a5" path="/var/lib/kubelet/pods/fc12ecc4-9a32-4c62-9509-798e0f6784a5/volumes"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.283479 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.283854 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c899c38-c8c0-4524-9bb5-ec72cd80c806-logs\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.283958 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.284052 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/4c899c38-c8c0-4524-9bb5-ec72cd80c806-kube-api-access-226hp\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.284134 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-config-data\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.284492 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c899c38-c8c0-4524-9bb5-ec72cd80c806-logs\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.289758 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.289859 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-config-data\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.299013 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.302177 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/4c899c38-c8c0-4524-9bb5-ec72cd80c806-kube-api-access-226hp\") pod \"nova-metadata-0\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.453143 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:04 crc kubenswrapper[4852]: I1210 12:16:04.930708 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:04 crc kubenswrapper[4852]: W1210 12:16:04.943327 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c899c38_c8c0_4524_9bb5_ec72cd80c806.slice/crio-1ec4d1b1d7b13810a79b1dbbfb9868cf36662344568c062880a17c7d92c3bd3b WatchSource:0}: Error finding container 1ec4d1b1d7b13810a79b1dbbfb9868cf36662344568c062880a17c7d92c3bd3b: Status 404 returned error can't find the container with id 1ec4d1b1d7b13810a79b1dbbfb9868cf36662344568c062880a17c7d92c3bd3b
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.017348 4852 generic.go:334] "Generic (PLEG): container finished" podID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerID="2479c986a5ad88aa043914613b01ceb1d3cd197822b3893de0fad5ade1954b00" exitCode=0
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.017484 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67a4146b-61bc-4522-8247-43bbfa3dfed7","Type":"ContainerDied","Data":"2479c986a5ad88aa043914613b01ceb1d3cd197822b3893de0fad5ade1954b00"}
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.040945 4852 generic.go:334] "Generic (PLEG): container finished" podID="d9ae3fdf-8e16-4763-8f50-389a52334458" containerID="a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644" exitCode=0
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.041078 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9ae3fdf-8e16-4763-8f50-389a52334458","Type":"ContainerDied","Data":"a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644"}
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.043451 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c899c38-c8c0-4524-9bb5-ec72cd80c806","Type":"ContainerStarted","Data":"1ec4d1b1d7b13810a79b1dbbfb9868cf36662344568c062880a17c7d92c3bd3b"}
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.047038 4852 generic.go:334] "Generic (PLEG): container finished" podID="effe5adf-90b0-4a00-a67a-589d4f355203" containerID="b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9" exitCode=0
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.047092 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerDied","Data":"b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9"}
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.219642 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.405955 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-config-data\") pod \"d9ae3fdf-8e16-4763-8f50-389a52334458\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.406027 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4nnd\" (UniqueName: \"kubernetes.io/projected/d9ae3fdf-8e16-4763-8f50-389a52334458-kube-api-access-k4nnd\") pod \"d9ae3fdf-8e16-4763-8f50-389a52334458\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.406204 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-combined-ca-bundle\") pod \"d9ae3fdf-8e16-4763-8f50-389a52334458\" (UID: \"d9ae3fdf-8e16-4763-8f50-389a52334458\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.412533 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ae3fdf-8e16-4763-8f50-389a52334458-kube-api-access-k4nnd" (OuterVolumeSpecName: "kube-api-access-k4nnd") pod "d9ae3fdf-8e16-4763-8f50-389a52334458" (UID: "d9ae3fdf-8e16-4763-8f50-389a52334458"). InnerVolumeSpecName "kube-api-access-k4nnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.436000 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-config-data" (OuterVolumeSpecName: "config-data") pod "d9ae3fdf-8e16-4763-8f50-389a52334458" (UID: "d9ae3fdf-8e16-4763-8f50-389a52334458"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.437261 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9ae3fdf-8e16-4763-8f50-389a52334458" (UID: "d9ae3fdf-8e16-4763-8f50-389a52334458"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.508451 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4nnd\" (UniqueName: \"kubernetes.io/projected/d9ae3fdf-8e16-4763-8f50-389a52334458-kube-api-access-k4nnd\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.508483 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.508493 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae3fdf-8e16-4763-8f50-389a52334458-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.641604 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.813901 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a4146b-61bc-4522-8247-43bbfa3dfed7-logs\") pod \"67a4146b-61bc-4522-8247-43bbfa3dfed7\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.814440 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a4146b-61bc-4522-8247-43bbfa3dfed7-logs" (OuterVolumeSpecName: "logs") pod "67a4146b-61bc-4522-8247-43bbfa3dfed7" (UID: "67a4146b-61bc-4522-8247-43bbfa3dfed7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.814462 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-config-data\") pod \"67a4146b-61bc-4522-8247-43bbfa3dfed7\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.814508 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsfgv\" (UniqueName: \"kubernetes.io/projected/67a4146b-61bc-4522-8247-43bbfa3dfed7-kube-api-access-tsfgv\") pod \"67a4146b-61bc-4522-8247-43bbfa3dfed7\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.814659 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-combined-ca-bundle\") pod \"67a4146b-61bc-4522-8247-43bbfa3dfed7\" (UID: \"67a4146b-61bc-4522-8247-43bbfa3dfed7\") "
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.815469 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a4146b-61bc-4522-8247-43bbfa3dfed7-logs\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.818652 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a4146b-61bc-4522-8247-43bbfa3dfed7-kube-api-access-tsfgv" (OuterVolumeSpecName: "kube-api-access-tsfgv") pod "67a4146b-61bc-4522-8247-43bbfa3dfed7" (UID: "67a4146b-61bc-4522-8247-43bbfa3dfed7"). InnerVolumeSpecName "kube-api-access-tsfgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.840213 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-config-data" (OuterVolumeSpecName: "config-data") pod "67a4146b-61bc-4522-8247-43bbfa3dfed7" (UID: "67a4146b-61bc-4522-8247-43bbfa3dfed7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.846308 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67a4146b-61bc-4522-8247-43bbfa3dfed7" (UID: "67a4146b-61bc-4522-8247-43bbfa3dfed7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.917099 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.917134 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsfgv\" (UniqueName: \"kubernetes.io/projected/67a4146b-61bc-4522-8247-43bbfa3dfed7-kube-api-access-tsfgv\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:05 crc kubenswrapper[4852]: I1210 12:16:05.917145 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a4146b-61bc-4522-8247-43bbfa3dfed7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.078151 4852 generic.go:334] "Generic (PLEG): container finished" podID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerID="f420eacbf4fc57ab16572d037b25d04c4bad0375c0a7c419cc1e238afe9cd3b6" exitCode=0
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.078272 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerDied","Data":"f420eacbf4fc57ab16572d037b25d04c4bad0375c0a7c419cc1e238afe9cd3b6"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.081544 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9ae3fdf-8e16-4763-8f50-389a52334458","Type":"ContainerDied","Data":"f8b5c8edd64e4a18c82b54867c2f8da111eeb5455a14e1bce52bc2a7bbcc4685"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.081591 4852 scope.go:117] "RemoveContainer" containerID="a31e30447c33edd0f5b2a6b9fa43e78b73b023631da38727694eca441ecbf644"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.081728 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.084579 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c899c38-c8c0-4524-9bb5-ec72cd80c806","Type":"ContainerStarted","Data":"a307ddae9c258eee8dbef34b1d4e8b51e77a37d9486301302423efda57f555c9"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.084626 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c899c38-c8c0-4524-9bb5-ec72cd80c806","Type":"ContainerStarted","Data":"efbaa538454e2216e36b1c7df26827ccedf45f70ea51517fdad3aad2e8dc6529"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.089515 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"20797400-1dd7-4c4b-af50-9f0c839a06c6","Type":"ContainerStarted","Data":"c14c76aa129f2ebfef2131f4717548971cb8b3c915f5fd5548a7e116c7fe7e5b"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.089692 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.095031 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerStarted","Data":"ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.098020 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67a4146b-61bc-4522-8247-43bbfa3dfed7","Type":"ContainerDied","Data":"21787076010daa9e8a8efb09e7271ec17c7f9f4c5dff1076479bba46af674513"}
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.098110 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.124146 4852 scope.go:117] "RemoveContainer" containerID="2479c986a5ad88aa043914613b01ceb1d3cd197822b3893de0fad5ade1954b00"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.128808 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.128790338 podStartE2EDuration="2.128790338s" podCreationTimestamp="2025-12-10 12:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:06.108103871 +0000 UTC m=+1452.193629095" watchObservedRunningTime="2025-12-10 12:16:06.128790338 +0000 UTC m=+1452.214315572"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.154761 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.318090395 podStartE2EDuration="5.154740937s" podCreationTimestamp="2025-12-10 12:16:01 +0000 UTC" firstStartedPulling="2025-12-10 12:16:02.87557119 +0000 UTC m=+1448.961096414" lastFinishedPulling="2025-12-10 12:16:04.712221732 +0000 UTC m=+1450.797746956" observedRunningTime="2025-12-10 12:16:06.133247529 +0000 UTC m=+1452.218772753" watchObservedRunningTime="2025-12-10 12:16:06.154740937 +0000 UTC m=+1452.240266161"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.165015 4852 scope.go:117] "RemoveContainer" containerID="878cf13b30dcfc5c838fceac47cbdd1471638fa40f5c5b1636bf4335f7b73eb0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.235540 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.259627 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.272363 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.272913 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-log"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.272985 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-log"
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.273010 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-api"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.273018 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-api"
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.273041 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ae3fdf-8e16-4763-8f50-389a52334458" containerName="nova-scheduler-scheduler"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.273049 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ae3fdf-8e16-4763-8f50-389a52334458" containerName="nova-scheduler-scheduler"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.273268 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-api"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.273289 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" containerName="nova-api-log"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.273316 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ae3fdf-8e16-4763-8f50-389a52334458" containerName="nova-scheduler-scheduler"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.274753 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.278527 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.285884 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.302574 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.305875 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.311686 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.320753 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.322102 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-central-agent"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322129 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-central-agent"
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.322150 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="proxy-httpd"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322159 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="proxy-httpd"
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.322182 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-notification-agent"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322190 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-notification-agent"
Dec 10 12:16:06 crc kubenswrapper[4852]: E1210 12:16:06.322222 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="sg-core"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322244 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="sg-core"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322487 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-notification-agent"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322517 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="sg-core"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322528 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="proxy-httpd"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.322544 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" containerName="ceilometer-central-agent"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.323719 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.330754 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331638 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh96h\" (UniqueName: \"kubernetes.io/projected/22d60e4a-73d6-4a31-a915-b336ce32d34f-kube-api-access-kh96h\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331672 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-run-httpd\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331723 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-config-data\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331746 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-combined-ca-bundle\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331781 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-log-httpd\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331812 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-sg-core-conf-yaml\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.331860 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-scripts\") pod \"22d60e4a-73d6-4a31-a915-b336ce32d34f\" (UID: \"22d60e4a-73d6-4a31-a915-b336ce32d34f\") "
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.332063 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pn25\" (UniqueName: \"kubernetes.io/projected/e24574d5-d688-42bc-b424-0fca36afa981-kube-api-access-6pn25\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.332166 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-config-data\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.332211 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-logs\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.332279 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.332307 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6lds\" (UniqueName: \"kubernetes.io/projected/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-kube-api-access-k6lds\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.334242 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.341797 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.341959 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-config-data\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.342141 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.342280 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.342292 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22d60e4a-73d6-4a31-a915-b336ce32d34f-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.346389 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.347417 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-scripts" (OuterVolumeSpecName: "scripts") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.370565 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d60e4a-73d6-4a31-a915-b336ce32d34f-kube-api-access-kh96h" (OuterVolumeSpecName: "kube-api-access-kh96h") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "kube-api-access-kh96h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.395607 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444652 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pn25\" (UniqueName: \"kubernetes.io/projected/e24574d5-d688-42bc-b424-0fca36afa981-kube-api-access-6pn25\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444739 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-config-data\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444776 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-logs\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444808 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444826 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6lds\" (UniqueName: \"kubernetes.io/projected/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-kube-api-access-k6lds\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444841 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-config-data\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444891 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444944 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444956 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.444965 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh96h\" (UniqueName: \"kubernetes.io/projected/22d60e4a-73d6-4a31-a915-b336ce32d34f-kube-api-access-kh96h\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.445212 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-logs\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.448635 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-config-data\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.450871 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.450927 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-config-data\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.451046 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.456715 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.460321 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6lds\" (UniqueName: \"kubernetes.io/projected/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-kube-api-access-k6lds\") pod \"nova-api-0\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " pod="openstack/nova-api-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.460353 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pn25\" (UniqueName: \"kubernetes.io/projected/e24574d5-d688-42bc-b424-0fca36afa981-kube-api-access-6pn25\") pod \"nova-scheduler-0\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.515612 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-config-data" (OuterVolumeSpecName: "config-data") pod "22d60e4a-73d6-4a31-a915-b336ce32d34f" (UID: "22d60e4a-73d6-4a31-a915-b336ce32d34f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.547663 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.547710 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22d60e4a-73d6-4a31-a915-b336ce32d34f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.622425 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:06 crc kubenswrapper[4852]: I1210 12:16:06.661816 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.091419 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.117911 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.117904 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22d60e4a-73d6-4a31-a915-b336ce32d34f","Type":"ContainerDied","Data":"4ae547d73f11ec0cec4f940a1651be7a5090e494905d1a44df8b2e59a82bf293"}
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.118472 4852 scope.go:117] "RemoveContainer" containerID="923dcee73f8bea88d93ebec76d72a03a548982e682d04deb32a2b83bc3729039"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.122718 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e24574d5-d688-42bc-b424-0fca36afa981","Type":"ContainerStarted","Data":"7d3ceda2a3e5e7011daff810a2b9cb524cfd9c5249f58c528cfff1a93319668e"}
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.124908 4852 generic.go:334] "Generic (PLEG): container finished" podID="effe5adf-90b0-4a00-a67a-589d4f355203" containerID="ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded" exitCode=0
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.124950 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerDied","Data":"ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded"}
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.245562 4852 scope.go:117] "RemoveContainer" containerID="7a871a26d3ae77a77a248583bbbe2d55f912bf2d0b45a0a2d07190bcc3fbab1b"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.252857 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.294286 4852 scope.go:117] "RemoveContainer" containerID="f420eacbf4fc57ab16572d037b25d04c4bad0375c0a7c419cc1e238afe9cd3b6"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.310147 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.331766 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.335178 4852 scope.go:117] "RemoveContainer" containerID="d40dd3fc0e1816f8acd2a31c4fee5f68835ed642fcb98f672584276c1e167de2"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.354210 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.357173 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.360089 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.360447 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.360564 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.371734 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.465934 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-log-httpd\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466024 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466122 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g7zn\" (UniqueName: \"kubernetes.io/projected/ebf07cac-5a1f-4988-97f5-869bafaa0072-kube-api-access-9g7zn\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466159 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-config-data\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466294 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-run-httpd\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466358 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-scripts\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.466413 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567671 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-scripts\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567751 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567782 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-log-httpd\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567820 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567864 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g7zn\" (UniqueName: \"kubernetes.io/projected/ebf07cac-5a1f-4988-97f5-869bafaa0072-kube-api-access-9g7zn\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567890 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-config-data\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567960 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-run-httpd\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.567984 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.569569 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-log-httpd\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.569865 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-run-httpd\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.574477 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-config-data\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.575048 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.576178 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-scripts\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.582803 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.585927 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.591174 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g7zn\" (UniqueName: \"kubernetes.io/projected/ebf07cac-5a1f-4988-97f5-869bafaa0072-kube-api-access-9g7zn\") pod \"ceilometer-0\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " pod="openstack/ceilometer-0"
Dec 10 12:16:07 crc kubenswrapper[4852]: I1210 12:16:07.683464 4852 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.106051 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.139786 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2","Type":"ContainerStarted","Data":"4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8"} Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.139841 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2","Type":"ContainerStarted","Data":"7f2c1efe6785fbef8fac7084f84ccdcd51f293b51183c0bd95bcb03e96838033"} Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.141431 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e24574d5-d688-42bc-b424-0fca36afa981","Type":"ContainerStarted","Data":"fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4"} Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.142342 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerStarted","Data":"42cf56a30e451835d437030a15a3b9a23b117ba560d1f46039008ec160495232"} Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.143888 4852 generic.go:334] "Generic (PLEG): container finished" podID="1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" containerID="66e553cae7591ba1c5daa51fce62c0e4f906c9fa7ead5729b146abf1dc87472d" exitCode=0 Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.143936 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b68mm" event={"ID":"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902","Type":"ContainerDied","Data":"66e553cae7591ba1c5daa51fce62c0e4f906c9fa7ead5729b146abf1dc87472d"} Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.163816 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.163796356 podStartE2EDuration="2.163796356s" podCreationTimestamp="2025-12-10 12:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:08.155242563 +0000 UTC m=+1454.240767817" watchObservedRunningTime="2025-12-10 12:16:08.163796356 +0000 UTC m=+1454.249321600" Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.182551 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d60e4a-73d6-4a31-a915-b336ce32d34f" path="/var/lib/kubelet/pods/22d60e4a-73d6-4a31-a915-b336ce32d34f/volumes" Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.183997 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a4146b-61bc-4522-8247-43bbfa3dfed7" path="/var/lib/kubelet/pods/67a4146b-61bc-4522-8247-43bbfa3dfed7/volumes" Dec 10 12:16:08 crc kubenswrapper[4852]: I1210 12:16:08.185599 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ae3fdf-8e16-4763-8f50-389a52334458" path="/var/lib/kubelet/pods/d9ae3fdf-8e16-4763-8f50-389a52334458/volumes" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.155707 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" 
event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerStarted","Data":"98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71"} Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.159124 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2","Type":"ContainerStarted","Data":"f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913"} Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.181459 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5n5vm" podStartSLOduration=4.191211298 podStartE2EDuration="7.181444265s" podCreationTimestamp="2025-12-10 12:16:02 +0000 UTC" firstStartedPulling="2025-12-10 12:16:05.048749031 +0000 UTC m=+1451.134274255" lastFinishedPulling="2025-12-10 12:16:08.038981998 +0000 UTC m=+1454.124507222" observedRunningTime="2025-12-10 12:16:09.176753317 +0000 UTC m=+1455.262278541" watchObservedRunningTime="2025-12-10 12:16:09.181444265 +0000 UTC m=+1455.266969489" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.453312 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.454475 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.560376 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.587370 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.587349097 podStartE2EDuration="3.587349097s" podCreationTimestamp="2025-12-10 12:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:09.213622939 +0000 UTC m=+1455.299148173" watchObservedRunningTime="2025-12-10 12:16:09.587349097 +0000 UTC m=+1455.672874331" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.623170 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-config-data\") pod \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.623289 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8tgn\" (UniqueName: \"kubernetes.io/projected/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-kube-api-access-h8tgn\") pod \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.623457 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-combined-ca-bundle\") pod \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\" (UID: \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.623501 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-scripts\") pod \"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\" (UID: 
\"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902\") " Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.630544 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-kube-api-access-h8tgn" (OuterVolumeSpecName: "kube-api-access-h8tgn") pod "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" (UID: "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902"). InnerVolumeSpecName "kube-api-access-h8tgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.641515 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-scripts" (OuterVolumeSpecName: "scripts") pod "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" (UID: "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.656580 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" (UID: "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.659243 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-config-data" (OuterVolumeSpecName: "config-data") pod "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" (UID: "1eefb3c9-b84f-4ce8-9b0e-80e3187c7902"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.725736 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.726203 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.726316 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8tgn\" (UniqueName: \"kubernetes.io/projected/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-kube-api-access-h8tgn\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:09 crc kubenswrapper[4852]: I1210 12:16:09.726387 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.171317 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b68mm" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.187461 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerStarted","Data":"9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678"} Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.187600 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b68mm" event={"ID":"1eefb3c9-b84f-4ce8-9b0e-80e3187c7902","Type":"ContainerDied","Data":"a6f0fe96f3752f8e139775e3c8c69fe2e3b90e3ba8522a3aa0be5b1355068702"} Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.187667 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6f0fe96f3752f8e139775e3c8c69fe2e3b90e3ba8522a3aa0be5b1355068702" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.259333 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 10 12:16:10 crc kubenswrapper[4852]: E1210 12:16:10.259759 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" containerName="nova-cell1-conductor-db-sync" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.259775 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" containerName="nova-cell1-conductor-db-sync" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.259972 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" containerName="nova-cell1-conductor-db-sync" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.260625 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.262933 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.267194 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.339328 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqz9r\" (UniqueName: \"kubernetes.io/projected/a903af04-d97a-42ba-94c4-af5d3c84de08-kube-api-access-nqz9r\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.339416 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a903af04-d97a-42ba-94c4-af5d3c84de08-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.339448 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a903af04-d97a-42ba-94c4-af5d3c84de08-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.441005 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqz9r\" (UniqueName: \"kubernetes.io/projected/a903af04-d97a-42ba-94c4-af5d3c84de08-kube-api-access-nqz9r\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.441119 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a903af04-d97a-42ba-94c4-af5d3c84de08-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.441155 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a903af04-d97a-42ba-94c4-af5d3c84de08-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.449021 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a903af04-d97a-42ba-94c4-af5d3c84de08-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.449150 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a903af04-d97a-42ba-94c4-af5d3c84de08-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.462841 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqz9r\" (UniqueName: \"kubernetes.io/projected/a903af04-d97a-42ba-94c4-af5d3c84de08-kube-api-access-nqz9r\") pod \"nova-cell1-conductor-0\" (UID: \"a903af04-d97a-42ba-94c4-af5d3c84de08\") " pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:10 crc kubenswrapper[4852]: I1210 12:16:10.575901 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:11 crc kubenswrapper[4852]: I1210 12:16:11.041793 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 10 12:16:11 crc kubenswrapper[4852]: I1210 12:16:11.184648 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a903af04-d97a-42ba-94c4-af5d3c84de08","Type":"ContainerStarted","Data":"9bd3ba3dfdeb4f451fe5039a2c226d75ea10832eed084ab2bb4a3d7f206aaa63"} Dec 10 12:16:11 crc kubenswrapper[4852]: I1210 12:16:11.623552 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 10 12:16:12 crc kubenswrapper[4852]: I1210 12:16:12.344201 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 10 12:16:12 crc kubenswrapper[4852]: I1210 12:16:12.661466 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:12 crc kubenswrapper[4852]: I1210 12:16:12.661516 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:13 crc kubenswrapper[4852]: I1210 12:16:13.709759 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5n5vm" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="registry-server" probeResult="failure" output=< Dec 10 12:16:13 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s Dec 10 12:16:13 crc kubenswrapper[4852]: > Dec 10 12:16:14 crc kubenswrapper[4852]: I1210 12:16:14.454086 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 10 12:16:14 crc kubenswrapper[4852]: I1210 12:16:14.454130 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 10 12:16:15 crc kubenswrapper[4852]: I1210 12:16:15.464420 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 10 12:16:15 crc kubenswrapper[4852]: I1210 12:16:15.464754 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 10 12:16:15 crc kubenswrapper[4852]: I1210 12:16:15.790801 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:16:15 crc 
kubenswrapper[4852]: I1210 12:16:15.791105 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:16:16 crc kubenswrapper[4852]: I1210 12:16:16.623618 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 10 12:16:16 crc kubenswrapper[4852]: I1210 12:16:16.652428 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 10 12:16:16 crc kubenswrapper[4852]: I1210 12:16:16.663109 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 10 12:16:16 crc kubenswrapper[4852]: I1210 12:16:16.663181 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 10 12:16:17 crc kubenswrapper[4852]: I1210 12:16:17.275669 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 10 12:16:17 crc kubenswrapper[4852]: I1210 12:16:17.746515 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 12:16:17 crc kubenswrapper[4852]: I1210 12:16:17.746756 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 12:16:20 crc kubenswrapper[4852]: I1210 12:16:20.266691 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a903af04-d97a-42ba-94c4-af5d3c84de08","Type":"ContainerStarted","Data":"ee6546852dfcd7a015d32f6d4d72c624f4902e1d6788e575706ff738525419f7"} Dec 10 12:16:20 crc kubenswrapper[4852]: I1210 12:16:20.267220 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:20 crc kubenswrapper[4852]: I1210 12:16:20.285014 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=10.284998658 podStartE2EDuration="10.284998658s" podCreationTimestamp="2025-12-10 12:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:20.283958472 +0000 UTC m=+1466.369483706" watchObservedRunningTime="2025-12-10 12:16:20.284998658 +0000 UTC m=+1466.370523872" Dec 10 12:16:21 crc kubenswrapper[4852]: I1210 12:16:21.278289 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerStarted","Data":"fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725"} Dec 10 12:16:22 crc kubenswrapper[4852]: I1210 12:16:22.708558 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:22 crc kubenswrapper[4852]: I1210 12:16:22.766771 4852 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:22 crc kubenswrapper[4852]: I1210 12:16:22.949567 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5n5vm"] Dec 10 12:16:23 crc kubenswrapper[4852]: I1210 12:16:23.296921 4852 generic.go:334] "Generic (PLEG): container finished" podID="4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" containerID="c58b23ccaf518a2f72729195b40d6bca8271c7eacb17384f74613307f5af9fe1" exitCode=137 Dec 10 12:16:23 crc kubenswrapper[4852]: I1210 12:16:23.297003 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2","Type":"ContainerDied","Data":"c58b23ccaf518a2f72729195b40d6bca8271c7eacb17384f74613307f5af9fe1"} Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.309677 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2","Type":"ContainerDied","Data":"f76878bc782ca6d28c9f76042206a6f02b134a9a41842ebde51c5af6f49bbfa4"} Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.310324 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f76878bc782ca6d28c9f76042206a6f02b134a9a41842ebde51c5af6f49bbfa4" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.309836 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5n5vm" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="registry-server" containerID="cri-o://98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71" gracePeriod=2 Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.347793 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.401991 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f56v\" (UniqueName: \"kubernetes.io/projected/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-kube-api-access-8f56v\") pod \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.402058 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-combined-ca-bundle\") pod \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.402156 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-config-data\") pod \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\" (UID: \"4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2\") " Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.408700 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-kube-api-access-8f56v" (OuterVolumeSpecName: "kube-api-access-8f56v") pod "4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" (UID: "4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2"). InnerVolumeSpecName "kube-api-access-8f56v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.438473 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-config-data" (OuterVolumeSpecName: "config-data") pod "4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" (UID: "4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.439169 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" (UID: "4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.461759 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.462611 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.469020 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.506843 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.506869 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f56v\" (UniqueName: \"kubernetes.io/projected/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-kube-api-access-8f56v\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.506878 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.752920 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.814263 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-utilities\") pod \"effe5adf-90b0-4a00-a67a-589d4f355203\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.814382 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-catalog-content\") pod \"effe5adf-90b0-4a00-a67a-589d4f355203\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.814500 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzs59\" (UniqueName: \"kubernetes.io/projected/effe5adf-90b0-4a00-a67a-589d4f355203-kube-api-access-lzs59\") pod \"effe5adf-90b0-4a00-a67a-589d4f355203\" (UID: \"effe5adf-90b0-4a00-a67a-589d4f355203\") " Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.816860 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-utilities" (OuterVolumeSpecName: "utilities") pod "effe5adf-90b0-4a00-a67a-589d4f355203" (UID: "effe5adf-90b0-4a00-a67a-589d4f355203"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.822382 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effe5adf-90b0-4a00-a67a-589d4f355203-kube-api-access-lzs59" (OuterVolumeSpecName: "kube-api-access-lzs59") pod "effe5adf-90b0-4a00-a67a-589d4f355203" (UID: "effe5adf-90b0-4a00-a67a-589d4f355203"). InnerVolumeSpecName "kube-api-access-lzs59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.919443 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.919477 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzs59\" (UniqueName: \"kubernetes.io/projected/effe5adf-90b0-4a00-a67a-589d4f355203-kube-api-access-lzs59\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:24 crc kubenswrapper[4852]: I1210 12:16:24.940338 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "effe5adf-90b0-4a00-a67a-589d4f355203" (UID: "effe5adf-90b0-4a00-a67a-589d4f355203"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.021640 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/effe5adf-90b0-4a00-a67a-589d4f355203-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.321078 4852 generic.go:334] "Generic (PLEG): container finished" podID="effe5adf-90b0-4a00-a67a-589d4f355203" containerID="98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71" exitCode=0 Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.321153 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerDied","Data":"98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71"} Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.321172 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5n5vm" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.321206 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n5vm" event={"ID":"effe5adf-90b0-4a00-a67a-589d4f355203","Type":"ContainerDied","Data":"e123e4576f4b383cb5d0c619f8d7d4de390f015a2f185a0ed7e39b024a470cc1"} Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.321250 4852 scope.go:117] "RemoveContainer" containerID="98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.326746 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerStarted","Data":"afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87"} Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.326781 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.346459 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.356028 4852 scope.go:117] "RemoveContainer" containerID="ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.379184 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.395595 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.404446 4852 scope.go:117] "RemoveContainer" containerID="b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.413407 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5n5vm"] Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.427684 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5n5vm"] Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.440185 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.440755 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="registry-server" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.440778 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="registry-server" Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.440806 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="extract-utilities" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.440816 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="extract-utilities" Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.440858 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="extract-content" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.440867 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="extract-content" Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.440889 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" containerName="nova-cell1-novncproxy-novncproxy" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.440897 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" containerName="nova-cell1-novncproxy-novncproxy" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.441139 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" containerName="nova-cell1-novncproxy-novncproxy" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.441172 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" containerName="registry-server" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.442006 4852 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.444619 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.444756 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.444857 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.445894 4852 scope.go:117] "RemoveContainer" containerID="98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71" Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.447001 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71\": container with ID starting with 98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71 not found: ID does not exist" containerID="98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.447033 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71"} err="failed to get container status \"98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71\": rpc error: code = NotFound desc = could not find container \"98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71\": container with ID starting with 98df0d7e6c8f26f86c17cce15961f1f681f8dbdd2ae0f7de333e979dfc7c9d71 not found: ID does not exist" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.447057 4852 scope.go:117] "RemoveContainer" containerID="ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded" Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.447486 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded\": container with ID starting with ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded not found: ID does not exist" containerID="ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.447517 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded"} err="failed to get container status \"ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded\": rpc error: code = NotFound desc = could not find container \"ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded\": container with ID starting with ef24645178e836d8a4521e27e976d94f97257d72732f1d0b93d6bf7b62d0eded not found: ID does not exist" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.447533 4852 scope.go:117] "RemoveContainer" containerID="b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9" Dec 10 12:16:25 crc kubenswrapper[4852]: E1210 12:16:25.448624 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9\": container with ID 
starting with b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9 not found: ID does not exist" containerID="b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.448652 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9"} err="failed to get container status \"b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9\": rpc error: code = NotFound desc = could not find container \"b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9\": container with ID starting with b957facf26152151e4355be0eeff4c50fc674bbcb9686c0e40a5f8fad55ee7f9 not found: ID does not exist" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.464479 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.531456 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.531714 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.531749 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.531808 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.531869 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c5z2\" (UniqueName: \"kubernetes.io/projected/5f6d8b73-adeb-47cd-9150-613bda06874e-kube-api-access-5c5z2\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.611013 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.634052 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc 
kubenswrapper[4852]: I1210 12:16:25.634121 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.634186 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.634247 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.634287 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c5z2\" (UniqueName: \"kubernetes.io/projected/5f6d8b73-adeb-47cd-9150-613bda06874e-kube-api-access-5c5z2\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.641697 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.642609 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.650743 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.652015 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f6d8b73-adeb-47cd-9150-613bda06874e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.655897 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c5z2\" (UniqueName: \"kubernetes.io/projected/5f6d8b73-adeb-47cd-9150-613bda06874e-kube-api-access-5c5z2\") pod \"nova-cell1-novncproxy-0\" (UID: \"5f6d8b73-adeb-47cd-9150-613bda06874e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:25 crc kubenswrapper[4852]: I1210 12:16:25.801270 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.185981 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2" path="/var/lib/kubelet/pods/4a78d9e0-dff9-4e29-98cd-e5b6749ad5c2/volumes" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.187083 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effe5adf-90b0-4a00-a67a-589d4f355203" path="/var/lib/kubelet/pods/effe5adf-90b0-4a00-a67a-589d4f355203/volumes" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.266816 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 10 12:16:26 crc kubenswrapper[4852]: W1210 12:16:26.267887 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f6d8b73_adeb_47cd_9150_613bda06874e.slice/crio-180e853b181e091ea02ecf245b0e34eebcf1efb923b91e53c9a62bb1006468ea WatchSource:0}: Error finding container 180e853b181e091ea02ecf245b0e34eebcf1efb923b91e53c9a62bb1006468ea: Status 404 returned error can't find the container with id 180e853b181e091ea02ecf245b0e34eebcf1efb923b91e53c9a62bb1006468ea Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.340499 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5f6d8b73-adeb-47cd-9150-613bda06874e","Type":"ContainerStarted","Data":"180e853b181e091ea02ecf245b0e34eebcf1efb923b91e53c9a62bb1006468ea"} Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.666385 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.667921 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.668168 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.668206 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.671285 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.671535 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.881472 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"] Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.887328 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.905632 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"] Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.967586 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.967639 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.967692 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.967738 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.967786 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-config\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:26 crc kubenswrapper[4852]: I1210 12:16:26.967822 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwp8s\" (UniqueName: \"kubernetes.io/projected/e54e2f70-296d-4e1c-a293-72b7a09e1e35-kube-api-access-jwp8s\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.069906 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.070199 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-config\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.070380 4852 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jwp8s\" (UniqueName: \"kubernetes.io/projected/e54e2f70-296d-4e1c-a293-72b7a09e1e35-kube-api-access-jwp8s\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.070561 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.070679 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.070826 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.071954 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.072755 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.074546 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-config\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.079332 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.079482 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.091255 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwp8s\" (UniqueName: 
\"kubernetes.io/projected/e54e2f70-296d-4e1c-a293-72b7a09e1e35-kube-api-access-jwp8s\") pod \"dnsmasq-dns-cd5cbd7b9-vxvl2\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.228048 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.364731 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5f6d8b73-adeb-47cd-9150-613bda06874e","Type":"ContainerStarted","Data":"9ec9a2dbe6f7bec47be1f34db70592a8ad31c14c7328622a7df1385241762eca"} Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.378653 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerStarted","Data":"0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0"} Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.389601 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.38958258 podStartE2EDuration="2.38958258s" podCreationTimestamp="2025-12-10 12:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:27.382055562 +0000 UTC m=+1473.467580786" watchObservedRunningTime="2025-12-10 12:16:27.38958258 +0000 UTC m=+1473.475107794" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.407469 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.562996882 podStartE2EDuration="20.407447837s" podCreationTimestamp="2025-12-10 12:16:07 +0000 UTC" firstStartedPulling="2025-12-10 12:16:08.111296455 +0000 UTC m=+1454.196821669" lastFinishedPulling="2025-12-10 12:16:26.9557474 +0000 UTC m=+1473.041272624" observedRunningTime="2025-12-10 12:16:27.406660647 +0000 UTC m=+1473.492185871" watchObservedRunningTime="2025-12-10 12:16:27.407447837 +0000 UTC m=+1473.492973061" Dec 10 12:16:27 crc kubenswrapper[4852]: I1210 12:16:27.715555 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"] Dec 10 12:16:27 crc kubenswrapper[4852]: W1210 12:16:27.728758 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode54e2f70_296d_4e1c_a293_72b7a09e1e35.slice/crio-c8495202f53c575b920450d450e69a7bc8553b11eb16caaea7dc0e39233b8d26 WatchSource:0}: Error finding container c8495202f53c575b920450d450e69a7bc8553b11eb16caaea7dc0e39233b8d26: Status 404 returned error can't find the container with id c8495202f53c575b920450d450e69a7bc8553b11eb16caaea7dc0e39233b8d26 Dec 10 12:16:28 crc kubenswrapper[4852]: I1210 12:16:28.388841 4852 generic.go:334] "Generic (PLEG): container finished" podID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerID="fbda619d8439d29cfc5c6e9954351b828053472ffc928c07ef12197aa6ba71f8" exitCode=0 Dec 10 12:16:28 crc kubenswrapper[4852]: I1210 12:16:28.389067 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" event={"ID":"e54e2f70-296d-4e1c-a293-72b7a09e1e35","Type":"ContainerDied","Data":"fbda619d8439d29cfc5c6e9954351b828053472ffc928c07ef12197aa6ba71f8"} Dec 10 12:16:28 crc kubenswrapper[4852]: I1210 12:16:28.389346 4852 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" event={"ID":"e54e2f70-296d-4e1c-a293-72b7a09e1e35","Type":"ContainerStarted","Data":"c8495202f53c575b920450d450e69a7bc8553b11eb16caaea7dc0e39233b8d26"} Dec 10 12:16:28 crc kubenswrapper[4852]: I1210 12:16:28.389652 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.215500 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.395744 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.402035 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-log" containerID="cri-o://4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8" gracePeriod=30 Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.403728 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" event={"ID":"e54e2f70-296d-4e1c-a293-72b7a09e1e35","Type":"ContainerStarted","Data":"b7241b7a62bb67de94d7caf316ae111319baeba133bfa65463dcf02e0b6c550c"} Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.403761 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.405012 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-api" containerID="cri-o://f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913" gracePeriod=30 Dec 10 12:16:29 crc kubenswrapper[4852]: I1210 12:16:29.433310 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" podStartSLOduration=3.433291086 podStartE2EDuration="3.433291086s" podCreationTimestamp="2025-12-10 12:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:29.432393603 +0000 UTC m=+1475.517918847" watchObservedRunningTime="2025-12-10 12:16:29.433291086 +0000 UTC m=+1475.518816310" Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.412946 4852 generic.go:334] "Generic (PLEG): container finished" podID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerID="4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8" exitCode=143 Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.413031 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2","Type":"ContainerDied","Data":"4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8"} Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.413720 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="proxy-httpd" containerID="cri-o://0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0" gracePeriod=30 Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.413740 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="sg-core" 
containerID="cri-o://afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87" gracePeriod=30 Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.413722 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-central-agent" containerID="cri-o://9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678" gracePeriod=30 Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.413823 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-notification-agent" containerID="cri-o://fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725" gracePeriod=30 Dec 10 12:16:30 crc kubenswrapper[4852]: I1210 12:16:30.814401 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.426830 4852 generic.go:334] "Generic (PLEG): container finished" podID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerID="0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0" exitCode=0 Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.426880 4852 generic.go:334] "Generic (PLEG): container finished" podID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerID="afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87" exitCode=2 Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.426892 4852 generic.go:334] "Generic (PLEG): container finished" podID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerID="9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678" exitCode=0 Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.426927 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerDied","Data":"0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0"} Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.426960 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerDied","Data":"afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87"} Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.426975 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerDied","Data":"9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678"} Dec 10 12:16:31 crc kubenswrapper[4852]: I1210 12:16:31.917740 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032152 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-config-data\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032196 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g7zn\" (UniqueName: \"kubernetes.io/projected/ebf07cac-5a1f-4988-97f5-869bafaa0072-kube-api-access-9g7zn\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032229 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-run-httpd\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032376 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-sg-core-conf-yaml\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032447 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-log-httpd\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032471 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-scripts\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032517 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-combined-ca-bundle\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032538 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-ceilometer-tls-certs\") pod \"ebf07cac-5a1f-4988-97f5-869bafaa0072\" (UID: \"ebf07cac-5a1f-4988-97f5-869bafaa0072\") " Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.032845 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.033089 4852 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.033608 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.040501 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-scripts" (OuterVolumeSpecName: "scripts") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.040575 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf07cac-5a1f-4988-97f5-869bafaa0072-kube-api-access-9g7zn" (OuterVolumeSpecName: "kube-api-access-9g7zn") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "kube-api-access-9g7zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.064348 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.086528 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.120890 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.135370 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g7zn\" (UniqueName: \"kubernetes.io/projected/ebf07cac-5a1f-4988-97f5-869bafaa0072-kube-api-access-9g7zn\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.135404 4852 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.135420 4852 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebf07cac-5a1f-4988-97f5-869bafaa0072-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.135430 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-scripts\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.135441 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.135451 4852 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.148342 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-config-data" (OuterVolumeSpecName: "config-data") pod "ebf07cac-5a1f-4988-97f5-869bafaa0072" (UID: "ebf07cac-5a1f-4988-97f5-869bafaa0072"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.237642 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebf07cac-5a1f-4988-97f5-869bafaa0072-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.437644 4852 generic.go:334] "Generic (PLEG): container finished" podID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerID="fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725" exitCode=0 Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.437887 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerDied","Data":"fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725"} Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.437913 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebf07cac-5a1f-4988-97f5-869bafaa0072","Type":"ContainerDied","Data":"42cf56a30e451835d437030a15a3b9a23b117ba560d1f46039008ec160495232"} Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.437930 4852 scope.go:117] "RemoveContainer" containerID="0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.438039 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.466641 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.474119 4852 scope.go:117] "RemoveContainer" containerID="afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.480215 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.493165 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.493689 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-notification-agent" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.493712 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-notification-agent" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.493729 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="proxy-httpd" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.493740 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="proxy-httpd" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.493761 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-central-agent" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.493769 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-central-agent" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.493799 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="sg-core" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.493806 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="sg-core" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.494026 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-notification-agent" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.494047 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="sg-core" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.494073 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="ceilometer-central-agent" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.494090 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" containerName="proxy-httpd" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.496348 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.499649 4852 scope.go:117] "RemoveContainer" containerID="fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.499924 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.500036 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.504588 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.506717 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.538715 4852 scope.go:117] "RemoveContainer" containerID="9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.634482 4852 scope.go:117] "RemoveContainer" containerID="0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.635170 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0\": container with ID starting with 0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0 not found: ID does not exist" containerID="0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.635207 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0"} err="failed to get container status \"0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0\": rpc error: code = NotFound desc = could not find container \"0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0\": container with ID starting with 0ff43d2b80a0d44367ceeaa6adac5e7fabfe14c98b14cd2c50d26dee3232f3f0 not found: ID does not exist" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.635249 4852 scope.go:117] "RemoveContainer" containerID="afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.635619 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87\": container with ID starting with afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87 not found: ID does not exist" containerID="afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.635664 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87"} err="failed to get container status \"afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87\": rpc error: code = NotFound desc = could not find container \"afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87\": container with ID starting with afbf81d48d1172d85a762d700679aed8a6febb8a79ae23148bb63d5f54c01f87 not found: ID does not exist" Dec 10 12:16:32 
crc kubenswrapper[4852]: I1210 12:16:32.635695 4852 scope.go:117] "RemoveContainer" containerID="fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.636165 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725\": container with ID starting with fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725 not found: ID does not exist" containerID="fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.636186 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725"} err="failed to get container status \"fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725\": rpc error: code = NotFound desc = could not find container \"fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725\": container with ID starting with fdea62e2ea9a2f4c3a969e9699b5342d25bfc78e3b450fe94b5eb0ae2f6cf725 not found: ID does not exist" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.636206 4852 scope.go:117] "RemoveContainer" containerID="9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678" Dec 10 12:16:32 crc kubenswrapper[4852]: E1210 12:16:32.636647 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678\": container with ID starting with 9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678 not found: ID does not exist" containerID="9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.636673 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678"} err="failed to get container status \"9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678\": rpc error: code = NotFound desc = could not find container \"9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678\": container with ID starting with 9a369aa389240817ce66e6d009b85e4a95cd16976c7e19fa16a308ff86de5678 not found: ID does not exist" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644299 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14750a4b-711e-443e-94aa-670159e43e44-log-httpd\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644388 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644433 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgzh\" (UniqueName: \"kubernetes.io/projected/14750a4b-711e-443e-94aa-670159e43e44-kube-api-access-scgzh\") pod \"ceilometer-0\" (UID: 
\"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644483 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-scripts\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644513 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14750a4b-711e-443e-94aa-670159e43e44-run-httpd\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644542 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644612 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.644632 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-config-data\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.746860 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.746925 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scgzh\" (UniqueName: \"kubernetes.io/projected/14750a4b-711e-443e-94aa-670159e43e44-kube-api-access-scgzh\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.746985 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-scripts\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.747017 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14750a4b-711e-443e-94aa-670159e43e44-run-httpd\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.747038 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.747117 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.747170 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-config-data\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.747211 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14750a4b-711e-443e-94aa-670159e43e44-log-httpd\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.747776 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14750a4b-711e-443e-94aa-670159e43e44-log-httpd\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.749262 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14750a4b-711e-443e-94aa-670159e43e44-run-httpd\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.752495 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.752596 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.753764 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.754444 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-config-data\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.756107 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14750a4b-711e-443e-94aa-670159e43e44-scripts\") 
pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.766566 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgzh\" (UniqueName: \"kubernetes.io/projected/14750a4b-711e-443e-94aa-670159e43e44-kube-api-access-scgzh\") pod \"ceilometer-0\" (UID: \"14750a4b-711e-443e-94aa-670159e43e44\") " pod="openstack/ceilometer-0" Dec 10 12:16:32 crc kubenswrapper[4852]: I1210 12:16:32.823450 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.298308 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 10 12:16:33 crc kubenswrapper[4852]: W1210 12:16:33.300319 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14750a4b_711e_443e_94aa_670159e43e44.slice/crio-d2ec2563d74eb5e982dd083ee5300c1ecf005ea872dce2ef5a31430be9e40081 WatchSource:0}: Error finding container d2ec2563d74eb5e982dd083ee5300c1ecf005ea872dce2ef5a31430be9e40081: Status 404 returned error can't find the container with id d2ec2563d74eb5e982dd083ee5300c1ecf005ea872dce2ef5a31430be9e40081 Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.436511 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.479677 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14750a4b-711e-443e-94aa-670159e43e44","Type":"ContainerStarted","Data":"d2ec2563d74eb5e982dd083ee5300c1ecf005ea872dce2ef5a31430be9e40081"} Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.487472 4852 generic.go:334] "Generic (PLEG): container finished" podID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerID="f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913" exitCode=0 Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.487581 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2","Type":"ContainerDied","Data":"f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913"} Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.487614 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2","Type":"ContainerDied","Data":"7f2c1efe6785fbef8fac7084f84ccdcd51f293b51183c0bd95bcb03e96838033"} Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.487635 4852 scope.go:117] "RemoveContainer" containerID="f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.487773 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.528504 4852 scope.go:117] "RemoveContainer" containerID="4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.561339 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6lds\" (UniqueName: \"kubernetes.io/projected/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-kube-api-access-k6lds\") pod \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.561404 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-logs\") pod \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.561423 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-combined-ca-bundle\") pod \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.561586 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-config-data\") pod \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\" (UID: \"f7dfedd6-3c79-47c8-8082-e76c2d9e47d2\") " Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.566543 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-logs" (OuterVolumeSpecName: "logs") pod "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" (UID: "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.569204 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-kube-api-access-k6lds" (OuterVolumeSpecName: "kube-api-access-k6lds") pod "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" (UID: "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2"). InnerVolumeSpecName "kube-api-access-k6lds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.594577 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-config-data" (OuterVolumeSpecName: "config-data") pod "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" (UID: "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.610157 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" (UID: "f7dfedd6-3c79-47c8-8082-e76c2d9e47d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.663603 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6lds\" (UniqueName: \"kubernetes.io/projected/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-kube-api-access-k6lds\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.663646 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.663656 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.663667 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.691498 4852 scope.go:117] "RemoveContainer" containerID="f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913" Dec 10 12:16:33 crc kubenswrapper[4852]: E1210 12:16:33.691944 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913\": container with ID starting with f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913 not found: ID does not exist" containerID="f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.691983 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913"} err="failed to get container status \"f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913\": rpc error: code = NotFound desc = could not find container \"f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913\": container with ID starting with f908e77aef29d38b7d2b9409456c232d0b9490103c22cbf9e0bb39ff51766913 not found: ID does not exist" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.692011 4852 scope.go:117] "RemoveContainer" containerID="4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8" Dec 10 12:16:33 crc kubenswrapper[4852]: E1210 12:16:33.692515 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8\": container with ID starting with 4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8 not found: ID does not exist" containerID="4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.692564 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8"} err="failed to get container status \"4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8\": rpc error: code = NotFound desc = could not find container \"4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8\": container with ID starting with 
4f61a6a2b1e0f95adff2d3286b55bd2afb69fb23d5aed4f222bcb46abd5ecaa8 not found: ID does not exist" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.820742 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.829797 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.842982 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:33 crc kubenswrapper[4852]: E1210 12:16:33.844157 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-api" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.844179 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-api" Dec 10 12:16:33 crc kubenswrapper[4852]: E1210 12:16:33.844203 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-log" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.844211 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-log" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.844593 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-log" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.844617 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" containerName="nova-api-api" Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.845607 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.849179 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.853175 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.853341 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.859487 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.969013 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-public-tls-certs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.969435 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-config-data\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.969528 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.969581 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.969623 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-logs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:33 crc kubenswrapper[4852]: I1210 12:16:33.969659 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4m4\" (UniqueName: \"kubernetes.io/projected/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-kube-api-access-zl4m4\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071141 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-config-data\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071248 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071288 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071308 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-logs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071359 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4m4\" (UniqueName: \"kubernetes.io/projected/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-kube-api-access-zl4m4\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071441 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-public-tls-certs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.071952 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-logs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.076173 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.077058 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-config-data\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.079679 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-public-tls-certs\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.082450 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.089006 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4m4\" (UniqueName: \"kubernetes.io/projected/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-kube-api-access-zl4m4\") pod \"nova-api-0\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") " pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.188651 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.202023 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf07cac-5a1f-4988-97f5-869bafaa0072" path="/var/lib/kubelet/pods/ebf07cac-5a1f-4988-97f5-869bafaa0072/volumes"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.203183 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dfedd6-3c79-47c8-8082-e76c2d9e47d2" path="/var/lib/kubelet/pods/f7dfedd6-3c79-47c8-8082-e76c2d9e47d2/volumes"
Dec 10 12:16:34 crc kubenswrapper[4852]: I1210 12:16:34.635162 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:34 crc kubenswrapper[4852]: W1210 12:16:34.637036 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9b8c1c5_76a2_4c98_bb1b_589b0863ce83.slice/crio-a6d6fd39c948393ff36ca56279b6b8e8c5e3e7145c855d29ffaf0b2582c22546 WatchSource:0}: Error finding container a6d6fd39c948393ff36ca56279b6b8e8c5e3e7145c855d29ffaf0b2582c22546: Status 404 returned error can't find the container with id a6d6fd39c948393ff36ca56279b6b8e8c5e3e7145c855d29ffaf0b2582c22546
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.518771 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83","Type":"ContainerStarted","Data":"1ffe03c76992cec29c6f69e23aa99890b376e1e78b78461f84806272db8dccb7"}
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.519384 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83","Type":"ContainerStarted","Data":"5bc45a10e41ffb6cd0f3175e59b83e6eef74c5d06c4c7e9820d0d98800596681"}
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.519482 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83","Type":"ContainerStarted","Data":"a6d6fd39c948393ff36ca56279b6b8e8c5e3e7145c855d29ffaf0b2582c22546"}
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.523270 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14750a4b-711e-443e-94aa-670159e43e44","Type":"ContainerStarted","Data":"62e45720e4fd31c16cb7da4b6896a460be42b28e5277325d491d43f6bc621444"}
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.563207 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5631873240000003 podStartE2EDuration="2.563187324s" podCreationTimestamp="2025-12-10 12:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:35.551267527 +0000 UTC m=+1481.636792761" watchObservedRunningTime="2025-12-10 12:16:35.563187324 +0000 UTC m=+1481.648712558"
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.802347 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 10 12:16:35 crc kubenswrapper[4852]: I1210 12:16:35.823760 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.534173 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14750a4b-711e-443e-94aa-670159e43e44","Type":"ContainerStarted","Data":"cd5a44e0fa608ddba96a2fb695f22ac3c7e8138271f1ad3e6ad2cf451f4697b8"}
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.550731 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.721477 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-r97dm"]
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.723529 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.726151 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.726162 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.733131 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r97dm"]
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.822069 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzqj\" (UniqueName: \"kubernetes.io/projected/481d8815-c8ec-4eeb-aad1-bb28f7161829-kube-api-access-mjzqj\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.822494 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-config-data\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.822623 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.822795 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-scripts\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.924622 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-scripts\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.924744 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzqj\" (UniqueName: \"kubernetes.io/projected/481d8815-c8ec-4eeb-aad1-bb28f7161829-kube-api-access-mjzqj\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.924834 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-config-data\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.924884 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.929326 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-scripts\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.930846 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.931685 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-config-data\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:36 crc kubenswrapper[4852]: I1210 12:16:36.940708 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzqj\" (UniqueName: \"kubernetes.io/projected/481d8815-c8ec-4eeb-aad1-bb28f7161829-kube-api-access-mjzqj\") pod \"nova-cell1-cell-mapping-r97dm\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") " pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.040145 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.231405 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.316004 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-x7855"]
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.316641 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-x7855" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="dnsmasq-dns" containerID="cri-o://1e65996dcd79af814e189fefb8a535c89ac0f12b73ef2b900b3f8f137a86bb29" gracePeriod=10
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.468608 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bccf8f775-x7855" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: connect: connection refused"
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.523693 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r97dm"]
Dec 10 12:16:37 crc kubenswrapper[4852]: W1210 12:16:37.540109 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481d8815_c8ec_4eeb_aad1_bb28f7161829.slice/crio-8f4fc2f9be2557e8319873b9bc42670de3390c4618dd6edde18fbfb1e95ffd0e WatchSource:0}: Error finding container 8f4fc2f9be2557e8319873b9bc42670de3390c4618dd6edde18fbfb1e95ffd0e: Status 404 returned error can't find the container with id 8f4fc2f9be2557e8319873b9bc42670de3390c4618dd6edde18fbfb1e95ffd0e
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.564956 4852 generic.go:334] "Generic (PLEG): container finished" podID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerID="1e65996dcd79af814e189fefb8a535c89ac0f12b73ef2b900b3f8f137a86bb29" exitCode=0
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.565063 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-x7855" event={"ID":"e127297c-061e-457a-9c2a-5794a1f39a3a","Type":"ContainerDied","Data":"1e65996dcd79af814e189fefb8a535c89ac0f12b73ef2b900b3f8f137a86bb29"}
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.576523 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14750a4b-711e-443e-94aa-670159e43e44","Type":"ContainerStarted","Data":"90c886ff89a2dcefa4d996b6e82f8a9a579e017cc3002f175e95b2ece61ab33e"}
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.826857 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-x7855"
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.948756 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-sb\") pod \"e127297c-061e-457a-9c2a-5794a1f39a3a\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") "
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.948820 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-svc\") pod \"e127297c-061e-457a-9c2a-5794a1f39a3a\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") "
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.948867 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2rnx\" (UniqueName: \"kubernetes.io/projected/e127297c-061e-457a-9c2a-5794a1f39a3a-kube-api-access-g2rnx\") pod \"e127297c-061e-457a-9c2a-5794a1f39a3a\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") "
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.948896 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-nb\") pod \"e127297c-061e-457a-9c2a-5794a1f39a3a\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") "
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.949029 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-config\") pod \"e127297c-061e-457a-9c2a-5794a1f39a3a\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") "
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.949089 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-swift-storage-0\") pod \"e127297c-061e-457a-9c2a-5794a1f39a3a\" (UID: \"e127297c-061e-457a-9c2a-5794a1f39a3a\") "
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.961394 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e127297c-061e-457a-9c2a-5794a1f39a3a-kube-api-access-g2rnx" (OuterVolumeSpecName: "kube-api-access-g2rnx") pod "e127297c-061e-457a-9c2a-5794a1f39a3a" (UID: "e127297c-061e-457a-9c2a-5794a1f39a3a"). InnerVolumeSpecName "kube-api-access-g2rnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:37 crc kubenswrapper[4852]: I1210 12:16:37.999082 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e127297c-061e-457a-9c2a-5794a1f39a3a" (UID: "e127297c-061e-457a-9c2a-5794a1f39a3a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.008367 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e127297c-061e-457a-9c2a-5794a1f39a3a" (UID: "e127297c-061e-457a-9c2a-5794a1f39a3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.014123 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e127297c-061e-457a-9c2a-5794a1f39a3a" (UID: "e127297c-061e-457a-9c2a-5794a1f39a3a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.016463 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e127297c-061e-457a-9c2a-5794a1f39a3a" (UID: "e127297c-061e-457a-9c2a-5794a1f39a3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.026400 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-config" (OuterVolumeSpecName: "config") pod "e127297c-061e-457a-9c2a-5794a1f39a3a" (UID: "e127297c-061e-457a-9c2a-5794a1f39a3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.053091 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.053130 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.053144 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.053155 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2rnx\" (UniqueName: \"kubernetes.io/projected/e127297c-061e-457a-9c2a-5794a1f39a3a-kube-api-access-g2rnx\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.053168 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.053180 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e127297c-061e-457a-9c2a-5794a1f39a3a-config\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.590177 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-x7855" event={"ID":"e127297c-061e-457a-9c2a-5794a1f39a3a","Type":"ContainerDied","Data":"c19d68174beb38f04cfb0a72e72a7613ea3d967ed23a81449cddeaf6fc84970f"}
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.590473 4852 scope.go:117] "RemoveContainer" containerID="1e65996dcd79af814e189fefb8a535c89ac0f12b73ef2b900b3f8f137a86bb29"
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.590390 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-x7855"
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.594067 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14750a4b-711e-443e-94aa-670159e43e44","Type":"ContainerStarted","Data":"79eded5f2645bc5b891ae6810dd62698fb3878da23849ad74e14fbc953ca18f9"}
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.594416 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.600268 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r97dm" event={"ID":"481d8815-c8ec-4eeb-aad1-bb28f7161829","Type":"ContainerStarted","Data":"179a290e11ba989f9e7d5abcf33d2c557dc0b52bf97756e917afbd70d4feb2c9"}
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.600319 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r97dm" event={"ID":"481d8815-c8ec-4eeb-aad1-bb28f7161829","Type":"ContainerStarted","Data":"8f4fc2f9be2557e8319873b9bc42670de3390c4618dd6edde18fbfb1e95ffd0e"}
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.615413 4852 scope.go:117] "RemoveContainer" containerID="aa7fd083008d9cb0d131aea641f785353c7942b80c415c6ae5dfd44dc153b228"
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.636890 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.849132994 podStartE2EDuration="6.636870316s" podCreationTimestamp="2025-12-10 12:16:32 +0000 UTC" firstStartedPulling="2025-12-10 12:16:33.303397418 +0000 UTC m=+1479.388922642" lastFinishedPulling="2025-12-10 12:16:38.09113474 +0000 UTC m=+1484.176659964" observedRunningTime="2025-12-10 12:16:38.622119278 +0000 UTC m=+1484.707644502" watchObservedRunningTime="2025-12-10 12:16:38.636870316 +0000 UTC m=+1484.722395540"
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.650915 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-x7855"]
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.661223 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-x7855"]
Dec 10 12:16:38 crc kubenswrapper[4852]: I1210 12:16:38.665132 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-r97dm" podStartSLOduration=2.6651155920000003 podStartE2EDuration="2.665115592s" podCreationTimestamp="2025-12-10 12:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:38.653106132 +0000 UTC m=+1484.738631366" watchObservedRunningTime="2025-12-10 12:16:38.665115592 +0000 UTC m=+1484.750640816"
Dec 10 12:16:40 crc kubenswrapper[4852]: I1210 12:16:40.183389 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" path="/var/lib/kubelet/pods/e127297c-061e-457a-9c2a-5794a1f39a3a/volumes"
Dec 10 12:16:43 crc kubenswrapper[4852]: I1210 12:16:43.649944 4852 generic.go:334] "Generic (PLEG): container finished" podID="481d8815-c8ec-4eeb-aad1-bb28f7161829" containerID="179a290e11ba989f9e7d5abcf33d2c557dc0b52bf97756e917afbd70d4feb2c9" exitCode=0
Dec 10 12:16:43 crc kubenswrapper[4852]: I1210 12:16:43.650079 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r97dm" event={"ID":"481d8815-c8ec-4eeb-aad1-bb28f7161829","Type":"ContainerDied","Data":"179a290e11ba989f9e7d5abcf33d2c557dc0b52bf97756e917afbd70d4feb2c9"}
Dec 10 12:16:44 crc kubenswrapper[4852]: I1210 12:16:44.193020 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 10 12:16:44 crc kubenswrapper[4852]: I1210 12:16:44.193075 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.034725 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.097663 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-scripts\") pod \"481d8815-c8ec-4eeb-aad1-bb28f7161829\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") "
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.097885 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-config-data\") pod \"481d8815-c8ec-4eeb-aad1-bb28f7161829\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") "
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.097952 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-combined-ca-bundle\") pod \"481d8815-c8ec-4eeb-aad1-bb28f7161829\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") "
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.097997 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjzqj\" (UniqueName: \"kubernetes.io/projected/481d8815-c8ec-4eeb-aad1-bb28f7161829-kube-api-access-mjzqj\") pod \"481d8815-c8ec-4eeb-aad1-bb28f7161829\" (UID: \"481d8815-c8ec-4eeb-aad1-bb28f7161829\") "
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.104611 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-scripts" (OuterVolumeSpecName: "scripts") pod "481d8815-c8ec-4eeb-aad1-bb28f7161829" (UID: "481d8815-c8ec-4eeb-aad1-bb28f7161829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.105317 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481d8815-c8ec-4eeb-aad1-bb28f7161829-kube-api-access-mjzqj" (OuterVolumeSpecName: "kube-api-access-mjzqj") pod "481d8815-c8ec-4eeb-aad1-bb28f7161829" (UID: "481d8815-c8ec-4eeb-aad1-bb28f7161829"). InnerVolumeSpecName "kube-api-access-mjzqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.132837 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481d8815-c8ec-4eeb-aad1-bb28f7161829" (UID: "481d8815-c8ec-4eeb-aad1-bb28f7161829"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.139364 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-config-data" (OuterVolumeSpecName: "config-data") pod "481d8815-c8ec-4eeb-aad1-bb28f7161829" (UID: "481d8815-c8ec-4eeb-aad1-bb28f7161829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.200176 4852 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-scripts\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.200212 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.200221 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481d8815-c8ec-4eeb-aad1-bb28f7161829-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.200249 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjzqj\" (UniqueName: \"kubernetes.io/projected/481d8815-c8ec-4eeb-aad1-bb28f7161829-kube-api-access-mjzqj\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.205468 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.205483 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.671485 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r97dm" event={"ID":"481d8815-c8ec-4eeb-aad1-bb28f7161829","Type":"ContainerDied","Data":"8f4fc2f9be2557e8319873b9bc42670de3390c4618dd6edde18fbfb1e95ffd0e"}
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.671531 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f4fc2f9be2557e8319873b9bc42670de3390c4618dd6edde18fbfb1e95ffd0e"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.671583 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r97dm"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.790357 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.790682 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.790881 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.791853 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c62f223218c8d67bf458bba29b25f48f874ad6d23f1af6c44094e9bc123c137"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.791999 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://3c62f223218c8d67bf458bba29b25f48f874ad6d23f1af6c44094e9bc123c137" gracePeriod=600
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.873602 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.873835 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-log" containerID="cri-o://5bc45a10e41ffb6cd0f3175e59b83e6eef74c5d06c4c7e9820d0d98800596681" gracePeriod=30
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.874187 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-api" containerID="cri-o://1ffe03c76992cec29c6f69e23aa99890b376e1e78b78461f84806272db8dccb7" gracePeriod=30
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.886798 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.886994 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e24574d5-d688-42bc-b424-0fca36afa981" containerName="nova-scheduler-scheduler" containerID="cri-o://fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4" gracePeriod=30
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.905756 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.910414 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-log" containerID="cri-o://efbaa538454e2216e36b1c7df26827ccedf45f70ea51517fdad3aad2e8dc6529" gracePeriod=30
Dec 10 12:16:45 crc kubenswrapper[4852]: I1210 12:16:45.910598 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-metadata" containerID="cri-o://a307ddae9c258eee8dbef34b1d4e8b51e77a37d9486301302423efda57f555c9" gracePeriod=30
Dec 10 12:16:46 crc kubenswrapper[4852]: E1210 12:16:46.626045 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 10 12:16:46 crc kubenswrapper[4852]: E1210 12:16:46.628579 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 10 12:16:46 crc kubenswrapper[4852]: E1210 12:16:46.632441 4852 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 10 12:16:46 crc kubenswrapper[4852]: E1210 12:16:46.632536 4852 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e24574d5-d688-42bc-b424-0fca36afa981" containerName="nova-scheduler-scheduler"
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.682054 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="3c62f223218c8d67bf458bba29b25f48f874ad6d23f1af6c44094e9bc123c137" exitCode=0
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.682109 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"3c62f223218c8d67bf458bba29b25f48f874ad6d23f1af6c44094e9bc123c137"}
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.682153 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"}
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.682168 4852 scope.go:117] "RemoveContainer" containerID="15e58a6d5758dde8e8be6570ea8629914b8054e6378a86d3d8b1552b7be80d78"
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.683893 4852 generic.go:334] "Generic (PLEG): container finished" podID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerID="5bc45a10e41ffb6cd0f3175e59b83e6eef74c5d06c4c7e9820d0d98800596681" exitCode=143
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.683963 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83","Type":"ContainerDied","Data":"5bc45a10e41ffb6cd0f3175e59b83e6eef74c5d06c4c7e9820d0d98800596681"}
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.685574 4852 generic.go:334] "Generic (PLEG): container finished" podID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerID="efbaa538454e2216e36b1c7df26827ccedf45f70ea51517fdad3aad2e8dc6529" exitCode=143
Dec 10 12:16:46 crc kubenswrapper[4852]: I1210 12:16:46.685602 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c899c38-c8c0-4524-9bb5-ec72cd80c806","Type":"ContainerDied","Data":"efbaa538454e2216e36b1c7df26827ccedf45f70ea51517fdad3aad2e8dc6529"}
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.697254 4852 generic.go:334] "Generic (PLEG): container finished" podID="e24574d5-d688-42bc-b424-0fca36afa981" containerID="fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4" exitCode=0
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.697833 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e24574d5-d688-42bc-b424-0fca36afa981","Type":"ContainerDied","Data":"fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4"}
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.698451 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e24574d5-d688-42bc-b424-0fca36afa981","Type":"ContainerDied","Data":"7d3ceda2a3e5e7011daff810a2b9cb524cfd9c5249f58c528cfff1a93319668e"}
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.698473 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3ceda2a3e5e7011daff810a2b9cb524cfd9c5249f58c528cfff1a93319668e"
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.774436 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.857037 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-combined-ca-bundle\") pod \"e24574d5-d688-42bc-b424-0fca36afa981\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") "
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.857150 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pn25\" (UniqueName: \"kubernetes.io/projected/e24574d5-d688-42bc-b424-0fca36afa981-kube-api-access-6pn25\") pod \"e24574d5-d688-42bc-b424-0fca36afa981\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") "
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.857212 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-config-data\") pod \"e24574d5-d688-42bc-b424-0fca36afa981\" (UID: \"e24574d5-d688-42bc-b424-0fca36afa981\") "
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.862294 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24574d5-d688-42bc-b424-0fca36afa981-kube-api-access-6pn25" (OuterVolumeSpecName: "kube-api-access-6pn25") pod "e24574d5-d688-42bc-b424-0fca36afa981" (UID: "e24574d5-d688-42bc-b424-0fca36afa981"). InnerVolumeSpecName "kube-api-access-6pn25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.887305 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e24574d5-d688-42bc-b424-0fca36afa981" (UID: "e24574d5-d688-42bc-b424-0fca36afa981"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.903573 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-config-data" (OuterVolumeSpecName: "config-data") pod "e24574d5-d688-42bc-b424-0fca36afa981" (UID: "e24574d5-d688-42bc-b424-0fca36afa981"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.959500 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.959535 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pn25\" (UniqueName: \"kubernetes.io/projected/e24574d5-d688-42bc-b424-0fca36afa981-kube-api-access-6pn25\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:47 crc kubenswrapper[4852]: I1210 12:16:47.959546 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24574d5-d688-42bc-b424-0fca36afa981-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.710286 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.737262 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.746835 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.761911 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:48 crc kubenswrapper[4852]: E1210 12:16:48.762757 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481d8815-c8ec-4eeb-aad1-bb28f7161829" containerName="nova-manage"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.762887 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="481d8815-c8ec-4eeb-aad1-bb28f7161829" containerName="nova-manage"
Dec 10 12:16:48 crc kubenswrapper[4852]: E1210 12:16:48.763984 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="init"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.764076 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="init"
Dec 10 12:16:48 crc kubenswrapper[4852]: E1210 12:16:48.764178 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24574d5-d688-42bc-b424-0fca36afa981" containerName="nova-scheduler-scheduler"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.764281 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24574d5-d688-42bc-b424-0fca36afa981" containerName="nova-scheduler-scheduler"
Dec 10 12:16:48 crc kubenswrapper[4852]: E1210 12:16:48.764404 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="dnsmasq-dns"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.764486 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="dnsmasq-dns"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.764801 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24574d5-d688-42bc-b424-0fca36afa981" containerName="nova-scheduler-scheduler"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.764897 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e127297c-061e-457a-9c2a-5794a1f39a3a" containerName="dnsmasq-dns"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.765044 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="481d8815-c8ec-4eeb-aad1-bb28f7161829" containerName="nova-manage"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.766201 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.768806 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.772707 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.908815 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9065a2ec-b14d-4376-87f7-2305a86dec0c-config-data\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.908899 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9065a2ec-b14d-4376-87f7-2305a86dec0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:48 crc kubenswrapper[4852]: I1210 12:16:48.908933 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d86d2\" (UniqueName: \"kubernetes.io/projected/9065a2ec-b14d-4376-87f7-2305a86dec0c-kube-api-access-d86d2\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.011032 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9065a2ec-b14d-4376-87f7-2305a86dec0c-config-data\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.011168 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9065a2ec-b14d-4376-87f7-2305a86dec0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.011214 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d86d2\" (UniqueName: \"kubernetes.io/projected/9065a2ec-b14d-4376-87f7-2305a86dec0c-kube-api-access-d86d2\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.018195 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9065a2ec-b14d-4376-87f7-2305a86dec0c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.019124 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9065a2ec-b14d-4376-87f7-2305a86dec0c-config-data\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.035552 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d86d2\" (UniqueName: \"kubernetes.io/projected/9065a2ec-b14d-4376-87f7-2305a86dec0c-kube-api-access-d86d2\") pod \"nova-scheduler-0\" (UID: \"9065a2ec-b14d-4376-87f7-2305a86dec0c\") " pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.126166 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.454028 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": dial tcp 10.217.0.195:8775: connect: connection refused"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.454028 4852 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": dial tcp 10.217.0.195:8775: connect: connection refused"
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.595768 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 10 12:16:49 crc kubenswrapper[4852]: W1210 12:16:49.605140 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9065a2ec_b14d_4376_87f7_2305a86dec0c.slice/crio-b4546590f309fae1c4604cba42ec4b600757178837db14143d820b8d64ecd082 WatchSource:0}: Error finding container b4546590f309fae1c4604cba42ec4b600757178837db14143d820b8d64ecd082: Status 404 returned error can't find the container with id b4546590f309fae1c4604cba42ec4b600757178837db14143d820b8d64ecd082
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.720564 4852 generic.go:334] "Generic (PLEG): container finished" podID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerID="a307ddae9c258eee8dbef34b1d4e8b51e77a37d9486301302423efda57f555c9" exitCode=0
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.720913 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c899c38-c8c0-4524-9bb5-ec72cd80c806","Type":"ContainerDied","Data":"a307ddae9c258eee8dbef34b1d4e8b51e77a37d9486301302423efda57f555c9"}
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.722253 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9065a2ec-b14d-4376-87f7-2305a86dec0c","Type":"ContainerStarted","Data":"b4546590f309fae1c4604cba42ec4b600757178837db14143d820b8d64ecd082"}
Dec 10 12:16:49 crc kubenswrapper[4852]: I1210 12:16:49.924027 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.030351 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-combined-ca-bundle\") pod \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.030423 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/4c899c38-c8c0-4524-9bb5-ec72cd80c806-kube-api-access-226hp\") pod \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.030471 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c899c38-c8c0-4524-9bb5-ec72cd80c806-logs\") pod \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.030647 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-config-data\") pod \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.030712 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-nova-metadata-tls-certs\") pod \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\" (UID: \"4c899c38-c8c0-4524-9bb5-ec72cd80c806\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.032503 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c899c38-c8c0-4524-9bb5-ec72cd80c806-logs" (OuterVolumeSpecName: "logs") pod "4c899c38-c8c0-4524-9bb5-ec72cd80c806" (UID: "4c899c38-c8c0-4524-9bb5-ec72cd80c806"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.045466 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c899c38-c8c0-4524-9bb5-ec72cd80c806-kube-api-access-226hp" (OuterVolumeSpecName: "kube-api-access-226hp") pod "4c899c38-c8c0-4524-9bb5-ec72cd80c806" (UID: "4c899c38-c8c0-4524-9bb5-ec72cd80c806"). InnerVolumeSpecName "kube-api-access-226hp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.063577 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-config-data" (OuterVolumeSpecName: "config-data") pod "4c899c38-c8c0-4524-9bb5-ec72cd80c806" (UID: "4c899c38-c8c0-4524-9bb5-ec72cd80c806"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.072716 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c899c38-c8c0-4524-9bb5-ec72cd80c806" (UID: "4c899c38-c8c0-4524-9bb5-ec72cd80c806"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.106471 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4c899c38-c8c0-4524-9bb5-ec72cd80c806" (UID: "4c899c38-c8c0-4524-9bb5-ec72cd80c806"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.132591 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-config-data\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.132621 4852 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.132633 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c899c38-c8c0-4524-9bb5-ec72cd80c806-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.132644 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/4c899c38-c8c0-4524-9bb5-ec72cd80c806-kube-api-access-226hp\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.132653 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c899c38-c8c0-4524-9bb5-ec72cd80c806-logs\") on node \"crc\" DevicePath \"\""
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.189850 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24574d5-d688-42bc-b424-0fca36afa981" path="/var/lib/kubelet/pods/e24574d5-d688-42bc-b424-0fca36afa981/volumes"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.732178 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c899c38-c8c0-4524-9bb5-ec72cd80c806","Type":"ContainerDied","Data":"1ec4d1b1d7b13810a79b1dbbfb9868cf36662344568c062880a17c7d92c3bd3b"}
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.732592 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.734319 4852 scope.go:117] "RemoveContainer" containerID="a307ddae9c258eee8dbef34b1d4e8b51e77a37d9486301302423efda57f555c9"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.735790 4852 generic.go:334] "Generic (PLEG): container finished" podID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerID="1ffe03c76992cec29c6f69e23aa99890b376e1e78b78461f84806272db8dccb7" exitCode=0
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.735852 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83","Type":"ContainerDied","Data":"1ffe03c76992cec29c6f69e23aa99890b376e1e78b78461f84806272db8dccb7"}
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.735882 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83","Type":"ContainerDied","Data":"a6d6fd39c948393ff36ca56279b6b8e8c5e3e7145c855d29ffaf0b2582c22546"}
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.735895 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d6fd39c948393ff36ca56279b6b8e8c5e3e7145c855d29ffaf0b2582c22546"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.737953 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9065a2ec-b14d-4376-87f7-2305a86dec0c","Type":"ContainerStarted","Data":"5ef0d141d62cfba3b071253b6862e703ed416201503628b14af55502e7307cd4"}
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.761766 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.761747493 podStartE2EDuration="2.761747493s" podCreationTimestamp="2025-12-10 12:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:50.755857095 +0000 UTC m=+1496.841382329" watchObservedRunningTime="2025-12-10 12:16:50.761747493 +0000 UTC m=+1496.847272717"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.785858 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.800992 4852 scope.go:117] "RemoveContainer" containerID="efbaa538454e2216e36b1c7df26827ccedf45f70ea51517fdad3aad2e8dc6529"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.813827 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.831477 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.851438 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:50 crc kubenswrapper[4852]: E1210 12:16:50.852110 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-api"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.852192 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-api"
Dec 10 12:16:50 crc kubenswrapper[4852]: E1210 12:16:50.852333 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-log"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.852402 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-log"
Dec 10 12:16:50 crc kubenswrapper[4852]: E1210 12:16:50.852461 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-metadata"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.852517 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-metadata"
Dec 10 12:16:50 crc kubenswrapper[4852]: E1210 12:16:50.852586 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-log"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.852647 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-log"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.852916 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-log"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.853027 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" containerName="nova-api-api"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.853103 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-metadata"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.853166 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" containerName="nova-metadata-log"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.854418 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.856620 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.857501 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.861454 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.945378 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl4m4\" (UniqueName: \"kubernetes.io/projected/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-kube-api-access-zl4m4\") pod \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.945685 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-public-tls-certs\") pod \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.945764 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-internal-tls-certs\") pod \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.945830 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-combined-ca-bundle\") pod \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.945857 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-config-data\") pod \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.946057 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-logs\") pod \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\" (UID: \"e9b8c1c5-76a2-4c98-bb1b-589b0863ce83\") "
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.946350 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.946389 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-config-data\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0"
Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.946703 4852 operation_generator.go:803] UnmountVolume.TearDown
succeeded for volume "kubernetes.io/empty-dir/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-logs" (OuterVolumeSpecName: "logs") pod "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" (UID: "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.946838 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-logs\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.947179 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.947425 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbncx\" (UniqueName: \"kubernetes.io/projected/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-kube-api-access-nbncx\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.947573 4852 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-logs\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.951506 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-kube-api-access-zl4m4" (OuterVolumeSpecName: "kube-api-access-zl4m4") pod "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" (UID: "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83"). InnerVolumeSpecName "kube-api-access-zl4m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.977977 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" (UID: "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:50 crc kubenswrapper[4852]: I1210 12:16:50.998686 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-config-data" (OuterVolumeSpecName: "config-data") pod "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" (UID: "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.012191 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" (UID: "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83"). InnerVolumeSpecName "internal-tls-certs". 
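PluginName "kubernetes.io/secret", VolumeGidValue ""

The operation_generator.go:803 entries above trace the kubelet's volume reconciler tearing down nova-api-0's volumes one at a time after the pod object was deleted; each volume is only reported as detached (the reconciler_common.go:293 lines) once its TearDown has succeeded. As a rough illustration, and not anything present in this capture, the Go sketch below tallies those teardown events per pod UID from journal lines read on stdin; the regular expression simply mirrors the operation_generator.go message format seen here.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Count UnmountVolume.TearDown successes per pod UID in journalctl
// output read from stdin. The pattern mirrors the
// operation_generator.go:803 lines visible in this log.
func main() {
	re := regexp.MustCompile(`UnmountVolume\.TearDown succeeded for volume "[^"]+" \(OuterVolumeSpecName: "([^"]+)"\) pod "([0-9a-f-]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[2]]++ // m[2] is the pod UID, m[1] the outer volume name
		}
	}
	for uid, n := range counts {
		fmt.Printf("%s: %d volumes torn down\n", uid, n)
	}
}
```

Fed a capture like this one (for example journalctl -u kubelet --no-pager | go run tally.go, where tally.go is a name chosen here), it prints one line per pod UID.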
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.033013 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" (UID: "e9b8c1c5-76a2-4c98-bb1b-589b0863ce83"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049559 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049635 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbncx\" (UniqueName: \"kubernetes.io/projected/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-kube-api-access-nbncx\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049678 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049698 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-config-data\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049738 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-logs\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049779 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl4m4\" (UniqueName: \"kubernetes.io/projected/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-kube-api-access-zl4m4\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049798 4852 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049813 4852 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049824 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.049835 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.050104 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-logs\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.052878 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.053584 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-config-data\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.054902 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.066785 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbncx\" (UniqueName: \"kubernetes.io/projected/2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3-kube-api-access-nbncx\") pod \"nova-metadata-0\" (UID: \"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3\") " pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.174390 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 10 12:16:51 crc kubenswrapper[4852]: W1210 12:16:51.619420 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fdca3d6_4ed5_4e7b_995d_c6c8e40739a3.slice/crio-995d7278c1a5043c550f6aebdaa9b615fa1c9c55aeccd83cff76ce8e346a5fff WatchSource:0}: Error finding container 995d7278c1a5043c550f6aebdaa9b615fa1c9c55aeccd83cff76ce8e346a5fff: Status 404 returned error can't find the container with id 995d7278c1a5043c550f6aebdaa9b615fa1c9c55aeccd83cff76ce8e346a5fff Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.627562 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.747258 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3","Type":"ContainerStarted","Data":"995d7278c1a5043c550f6aebdaa9b615fa1c9c55aeccd83cff76ce8e346a5fff"} Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.748347 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.794977 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.810237 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.840380 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.841988 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.844542 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.844900 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.845124 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.855513 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.969139 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cebd010-0435-40cc-9d60-2359682ee83e-logs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.969571 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-config-data\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.969600 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.969657 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-public-tls-certs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.969693 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:51 crc kubenswrapper[4852]: I1210 12:16:51.969765 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvg9r\" (UniqueName: \"kubernetes.io/projected/9cebd010-0435-40cc-9d60-2359682ee83e-kube-api-access-pvg9r\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc 
kubenswrapper[4852]: I1210 12:16:52.071896 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cebd010-0435-40cc-9d60-2359682ee83e-logs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.071983 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-config-data\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.072003 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.072040 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-public-tls-certs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.072066 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.072090 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvg9r\" (UniqueName: \"kubernetes.io/projected/9cebd010-0435-40cc-9d60-2359682ee83e-kube-api-access-pvg9r\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.072874 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cebd010-0435-40cc-9d60-2359682ee83e-logs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.076990 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-public-tls-certs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.078546 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.078905 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.078917 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cebd010-0435-40cc-9d60-2359682ee83e-config-data\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.088058 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvg9r\" (UniqueName: \"kubernetes.io/projected/9cebd010-0435-40cc-9d60-2359682ee83e-kube-api-access-pvg9r\") pod \"nova-api-0\" (UID: \"9cebd010-0435-40cc-9d60-2359682ee83e\") " pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.163076 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.181050 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c899c38-c8c0-4524-9bb5-ec72cd80c806" path="/var/lib/kubelet/pods/4c899c38-c8c0-4524-9bb5-ec72cd80c806/volumes" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.181745 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b8c1c5-76a2-4c98-bb1b-589b0863ce83" path="/var/lib/kubelet/pods/e9b8c1c5-76a2-4c98-bb1b-589b0863ce83/volumes" Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.623419 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 10 12:16:52 crc kubenswrapper[4852]: W1210 12:16:52.630064 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cebd010_0435_40cc_9d60_2359682ee83e.slice/crio-ad6a4e5eed092abb0d0d756bdbd791219c6b753af3b26298cfa70da0c5e264af WatchSource:0}: Error finding container ad6a4e5eed092abb0d0d756bdbd791219c6b753af3b26298cfa70da0c5e264af: Status 404 returned error can't find the container with id ad6a4e5eed092abb0d0d756bdbd791219c6b753af3b26298cfa70da0c5e264af Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.760459 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3","Type":"ContainerStarted","Data":"1cdc98138d66bd21a306e1d68939ceff7371e3bc1237845692435e5a96dcb3c7"} Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.760504 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3","Type":"ContainerStarted","Data":"c38d35eb9d1c2e20d1d05e0a29b69f11663adbe649f686045b38818d2392ce2e"} Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.761853 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cebd010-0435-40cc-9d60-2359682ee83e","Type":"ContainerStarted","Data":"ad6a4e5eed092abb0d0d756bdbd791219c6b753af3b26298cfa70da0c5e264af"} Dec 10 12:16:52 crc kubenswrapper[4852]: I1210 12:16:52.787422 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.787386257 podStartE2EDuration="2.787386257s" podCreationTimestamp="2025-12-10 12:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:52.777974972 +0000 UTC m=+1498.863500196" watchObservedRunningTime="2025-12-10 12:16:52.787386257 +0000 UTC m=+1498.872911491" Dec 10 12:16:53 crc kubenswrapper[4852]: I1210 12:16:53.787967 4852 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"9cebd010-0435-40cc-9d60-2359682ee83e","Type":"ContainerStarted","Data":"a8d19cbd0a96a2979f2c2e6b6d6f6daf92d18962c0a870b2a737c90fa64f3241"} Dec 10 12:16:53 crc kubenswrapper[4852]: I1210 12:16:53.789850 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cebd010-0435-40cc-9d60-2359682ee83e","Type":"ContainerStarted","Data":"e5d8f8215de38c9efa278e588f83a1eee56e74fe5113d7d88095d0f2a3f1b318"} Dec 10 12:16:53 crc kubenswrapper[4852]: I1210 12:16:53.835000 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.834978134 podStartE2EDuration="2.834978134s" podCreationTimestamp="2025-12-10 12:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:16:53.819283022 +0000 UTC m=+1499.904808336" watchObservedRunningTime="2025-12-10 12:16:53.834978134 +0000 UTC m=+1499.920503368" Dec 10 12:16:54 crc kubenswrapper[4852]: I1210 12:16:54.127279 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 10 12:16:56 crc kubenswrapper[4852]: I1210 12:16:56.180383 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 10 12:16:56 crc kubenswrapper[4852]: I1210 12:16:56.181448 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 10 12:16:59 crc kubenswrapper[4852]: I1210 12:16:59.127138 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 10 12:16:59 crc kubenswrapper[4852]: I1210 12:16:59.152463 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 10 12:16:59 crc kubenswrapper[4852]: I1210 12:16:59.875196 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 10 12:17:01 crc kubenswrapper[4852]: I1210 12:17:01.175592 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 10 12:17:01 crc kubenswrapper[4852]: I1210 12:17:01.176651 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 10 12:17:02 crc kubenswrapper[4852]: I1210 12:17:02.164952 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 10 12:17:02 crc kubenswrapper[4852]: I1210 12:17:02.165219 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 10 12:17:02 crc kubenswrapper[4852]: I1210 12:17:02.191535 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 10 12:17:02 crc kubenswrapper[4852]: I1210 12:17:02.191616 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 10 12:17:02 crc kubenswrapper[4852]: I1210 12:17:02.833970 4852 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 10 12:17:03 crc kubenswrapper[4852]: I1210 12:17:03.178409 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9cebd010-0435-40cc-9d60-2359682ee83e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 10 12:17:03 crc kubenswrapper[4852]: I1210 12:17:03.178443 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9cebd010-0435-40cc-9d60-2359682ee83e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 10 12:17:11 crc kubenswrapper[4852]: I1210 12:17:11.182011 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 10 12:17:11 crc kubenswrapper[4852]: I1210 12:17:11.183281 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 10 12:17:11 crc kubenswrapper[4852]: I1210 12:17:11.189971 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.018126 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.184787 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.185130 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.185896 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.186254 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.194704 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 10 12:17:12 crc kubenswrapper[4852]: I1210 12:17:12.194823 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 10 12:17:20 crc kubenswrapper[4852]: I1210 12:17:20.111637 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:17:21 crc kubenswrapper[4852]: I1210 12:17:21.241059 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:17:24 crc kubenswrapper[4852]: I1210 12:17:24.289750 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerName="rabbitmq" containerID="cri-o://09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac" gracePeriod=604796 Dec 10 12:17:25 crc kubenswrapper[4852]: I1210 12:17:25.070045 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerName="rabbitmq" containerID="cri-o://c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0" gracePeriod=604797 Dec 10 12:17:30 crc kubenswrapper[4852]: I1210 12:17:30.917082 4852 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.053760 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-tls\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.053853 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-plugins-conf\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054010 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-server-conf\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054051 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdv5\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-kube-api-access-tmdv5\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054089 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054129 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09a9edae-3cd0-4f71-ba18-9800a7baefef-pod-info\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054181 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-config-data\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054273 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-confd\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054319 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-erlang-cookie\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054356 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09a9edae-3cd0-4f71-ba18-9800a7baefef-erlang-cookie-secret\") pod 
\"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.054395 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-plugins\") pod \"09a9edae-3cd0-4f71-ba18-9800a7baefef\" (UID: \"09a9edae-3cd0-4f71-ba18-9800a7baefef\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.056000 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.057052 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.059807 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.061427 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.064503 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/09a9edae-3cd0-4f71-ba18-9800a7baefef-pod-info" (OuterVolumeSpecName: "pod-info") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.064572 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-kube-api-access-tmdv5" (OuterVolumeSpecName: "kube-api-access-tmdv5") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "kube-api-access-tmdv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.073574 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.079523 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a9edae-3cd0-4f71-ba18-9800a7baefef-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.107302 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-config-data" (OuterVolumeSpecName: "config-data") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156583 4852 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156623 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmdv5\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-kube-api-access-tmdv5\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156670 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156684 4852 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09a9edae-3cd0-4f71-ba18-9800a7baefef-pod-info\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156696 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156710 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156720 4852 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09a9edae-3cd0-4f71-ba18-9800a7baefef-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156730 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.156741 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.182369 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.189499 4852 generic.go:334] "Generic (PLEG): container finished" podID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerID="09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac" exitCode=0 Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.189538 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09a9edae-3cd0-4f71-ba18-9800a7baefef","Type":"ContainerDied","Data":"09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac"} Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.189562 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09a9edae-3cd0-4f71-ba18-9800a7baefef","Type":"ContainerDied","Data":"983fa8f7b86f529a0b5d318b2f39aa03f4a6aadf75435ce116221e4854a8228c"} Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.189577 4852 scope.go:117] "RemoveContainer" containerID="09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.189711 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.195101 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-server-conf" (OuterVolumeSpecName: "server-conf") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.225078 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "09a9edae-3cd0-4f71-ba18-9800a7baefef" (UID: "09a9edae-3cd0-4f71-ba18-9800a7baefef"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.262734 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09a9edae-3cd0-4f71-ba18-9800a7baefef-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.262771 4852 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09a9edae-3cd0-4f71-ba18-9800a7baefef-server-conf\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.262781 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.276539 4852 scope.go:117] "RemoveContainer" containerID="18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.351435 4852 scope.go:117] "RemoveContainer" containerID="09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac" Dec 10 12:17:31 crc kubenswrapper[4852]: E1210 12:17:31.351831 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac\": container with ID starting with 09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac not found: ID does not exist" containerID="09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.351864 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac"} err="failed to get container status \"09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac\": rpc error: code = NotFound desc = could not find container \"09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac\": container with ID starting with 09dee223514f7d7758a500f3d0f83c3e6f2927b47773c1c9d17ccaa4516032ac not found: ID does not exist" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.351887 4852 scope.go:117] "RemoveContainer" containerID="18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d" Dec 10 12:17:31 crc kubenswrapper[4852]: E1210 12:17:31.352156 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d\": container with ID starting with 18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d not found: ID does not exist" containerID="18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.352176 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d"} err="failed to get container status \"18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d\": rpc error: code = NotFound desc = could not find container \"18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d\": container with ID starting with 18599df88226bd241f6435ac8ab7aad04efd0a14f722468cb1f5ca50be06000d not found: ID does not exist" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.596881 4852 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.629387 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.650543 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:17:31 crc kubenswrapper[4852]: E1210 12:17:31.651072 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerName="rabbitmq" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.651109 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerName="rabbitmq" Dec 10 12:17:31 crc kubenswrapper[4852]: E1210 12:17:31.651135 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerName="setup-container" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.651144 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerName="setup-container" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.651380 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" containerName="rabbitmq" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.652585 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.654319 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.658794 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-c8894" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.659000 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.659645 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.659783 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.659951 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.660049 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.676435 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.727294 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784196 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-confd\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784291 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784334 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-plugins\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784555 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-config-data\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784614 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15a1ed1e-209b-4c71-b15f-44caaec70e93-erlang-cookie-secret\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784662 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15a1ed1e-209b-4c71-b15f-44caaec70e93-pod-info\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784785 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-plugins-conf\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784913 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-server-conf\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784936 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-tls\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784965 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdlxh\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-kube-api-access-kdlxh\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: 
\"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.784990 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-erlang-cookie\") pod \"15a1ed1e-209b-4c71-b15f-44caaec70e93\" (UID: \"15a1ed1e-209b-4c71-b15f-44caaec70e93\") " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785180 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785255 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfz5k\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-kube-api-access-wfz5k\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785325 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-server-conf\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785366 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785404 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/280ccc25-3ba2-46ea-b167-19480cb76a48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785443 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785613 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785646 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " 
pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785679 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-config-data\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785706 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785724 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785754 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/280ccc25-3ba2-46ea-b167-19480cb76a48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.785809 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.786979 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.787008 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.789725 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/15a1ed1e-209b-4c71-b15f-44caaec70e93-pod-info" (OuterVolumeSpecName: "pod-info") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.791534 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-kube-api-access-kdlxh" (OuterVolumeSpecName: "kube-api-access-kdlxh") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). 
InnerVolumeSpecName "kube-api-access-kdlxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.791617 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.805338 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a1ed1e-209b-4c71-b15f-44caaec70e93-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.806164 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.850877 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-config-data" (OuterVolumeSpecName: "config-data") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.889834 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/280ccc25-3ba2-46ea-b167-19480cb76a48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.890119 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.891084 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.891457 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.890261 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.891951 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.892042 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-config-data\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.892110 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.892149 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.892196 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/280ccc25-3ba2-46ea-b167-19480cb76a48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.892263 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfz5k\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-kube-api-access-wfz5k\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.890120 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-server-conf" (OuterVolumeSpecName: "server-conf") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.892378 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-server-conf\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.903294 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/280ccc25-3ba2-46ea-b167-19480cb76a48-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.904178 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.905872 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908687 4852 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15a1ed1e-209b-4c71-b15f-44caaec70e93-pod-info\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908724 4852 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908736 4852 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-server-conf\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908747 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc 
kubenswrapper[4852]: I1210 12:17:31.908760 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdlxh\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-kube-api-access-kdlxh\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908771 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908805 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908820 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15a1ed1e-209b-4c71-b15f-44caaec70e93-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.908833 4852 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15a1ed1e-209b-4c71-b15f-44caaec70e93-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.912956 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.913201 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.913446 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-server-conf\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.913879 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/280ccc25-3ba2-46ea-b167-19480cb76a48-pod-info\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.914866 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/280ccc25-3ba2-46ea-b167-19480cb76a48-config-data\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.917493 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.928160 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfz5k\" (UniqueName: \"kubernetes.io/projected/280ccc25-3ba2-46ea-b167-19480cb76a48-kube-api-access-wfz5k\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.930395 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"280ccc25-3ba2-46ea-b167-19480cb76a48\") " pod="openstack/rabbitmq-server-0" Dec 10 12:17:31 crc kubenswrapper[4852]: I1210 12:17:31.931812 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.010586 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.028353 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "15a1ed1e-209b-4c71-b15f-44caaec70e93" (UID: "15a1ed1e-209b-4c71-b15f-44caaec70e93"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.031969 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.113034 4852 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15a1ed1e-209b-4c71-b15f-44caaec70e93-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.181518 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a9edae-3cd0-4f71-ba18-9800a7baefef" path="/var/lib/kubelet/pods/09a9edae-3cd0-4f71-ba18-9800a7baefef/volumes" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.206845 4852 generic.go:334] "Generic (PLEG): container finished" podID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerID="c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0" exitCode=0 Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.206914 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15a1ed1e-209b-4c71-b15f-44caaec70e93","Type":"ContainerDied","Data":"c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0"} Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.206943 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15a1ed1e-209b-4c71-b15f-44caaec70e93","Type":"ContainerDied","Data":"a17be7876e80540d338945b08f519ab20c337a2ab5d470a78b2f736639715b62"} Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.206960 4852 scope.go:117] "RemoveContainer" containerID="c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.207059 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.241428 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.260542 4852 scope.go:117] "RemoveContainer" containerID="021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.263331 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.287579 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:17:32 crc kubenswrapper[4852]: E1210 12:17:32.293829 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerName="rabbitmq" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.293879 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerName="rabbitmq" Dec 10 12:17:32 crc kubenswrapper[4852]: E1210 12:17:32.293930 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerName="setup-container" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.293944 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerName="setup-container" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.294303 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" containerName="rabbitmq" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.296149 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.303498 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.303745 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.303929 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9gkv5" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.304059 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.304248 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.304466 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.304614 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.305965 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.327148 4852 scope.go:117] "RemoveContainer" containerID="c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0" Dec 10 12:17:32 crc kubenswrapper[4852]: E1210 12:17:32.328915 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0\": container with ID starting with c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0 not found: ID does not exist" containerID="c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.328946 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0"} err="failed to get container status \"c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0\": rpc error: code = NotFound desc = could not find container \"c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0\": container with ID starting with c3f6f0d265dfbf49c7bc65beef2635221c7ed03d78a883154866495a4f17e8c0 not found: ID does not exist" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.328968 4852 scope.go:117] "RemoveContainer" containerID="021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27" Dec 10 12:17:32 crc kubenswrapper[4852]: E1210 12:17:32.330456 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27\": container with ID starting with 021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27 not found: ID does not exist" containerID="021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.330491 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27"} 
err="failed to get container status \"021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27\": rpc error: code = NotFound desc = could not find container \"021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27\": container with ID starting with 021587d4484857ec523fba13c90e81424d66375c8ddf79130e1130b3a58aca27 not found: ID does not exist" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419259 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419326 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419349 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419377 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63001b32-e957-4b24-a742-7932191e7598-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419448 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419547 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419583 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419605 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 
12:17:32.419639 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419693 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63001b32-e957-4b24-a742-7932191e7598-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.419720 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz269\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-kube-api-access-wz269\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.474127 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 10 12:17:32 crc kubenswrapper[4852]: W1210 12:17:32.480259 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280ccc25_3ba2_46ea_b167_19480cb76a48.slice/crio-f2dfef6d318e1f4f2b03b4e971c99ffe8f594c6704277df392072d85ada64031 WatchSource:0}: Error finding container f2dfef6d318e1f4f2b03b4e971c99ffe8f594c6704277df392072d85ada64031: Status 404 returned error can't find the container with id f2dfef6d318e1f4f2b03b4e971c99ffe8f594c6704277df392072d85ada64031 Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521438 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521484 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521524 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521550 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63001b32-e957-4b24-a742-7932191e7598-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521572 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wz269\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-kube-api-access-wz269\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521629 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521673 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521696 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521726 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63001b32-e957-4b24-a742-7932191e7598-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521774 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.521844 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.522162 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.522576 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.522708 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.522818 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.523115 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.523921 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63001b32-e957-4b24-a742-7932191e7598-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.525910 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63001b32-e957-4b24-a742-7932191e7598-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.527820 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63001b32-e957-4b24-a742-7932191e7598-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.528192 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.528393 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.553858 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz269\" (UniqueName: \"kubernetes.io/projected/63001b32-e957-4b24-a742-7932191e7598-kube-api-access-wz269\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.583593 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"63001b32-e957-4b24-a742-7932191e7598\") " pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.633608 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:17:32 crc kubenswrapper[4852]: I1210 12:17:32.915827 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 10 12:17:32 crc kubenswrapper[4852]: W1210 12:17:32.916472 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63001b32_e957_4b24_a742_7932191e7598.slice/crio-ec4492d924a43a68f9bc57dbaade1e208da96702d2c83d9715addd794e6bf42f WatchSource:0}: Error finding container ec4492d924a43a68f9bc57dbaade1e208da96702d2c83d9715addd794e6bf42f: Status 404 returned error can't find the container with id ec4492d924a43a68f9bc57dbaade1e208da96702d2c83d9715addd794e6bf42f Dec 10 12:17:33 crc kubenswrapper[4852]: I1210 12:17:33.218976 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"280ccc25-3ba2-46ea-b167-19480cb76a48","Type":"ContainerStarted","Data":"f2dfef6d318e1f4f2b03b4e971c99ffe8f594c6704277df392072d85ada64031"} Dec 10 12:17:33 crc kubenswrapper[4852]: I1210 12:17:33.220559 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"63001b32-e957-4b24-a742-7932191e7598","Type":"ContainerStarted","Data":"ec4492d924a43a68f9bc57dbaade1e208da96702d2c83d9715addd794e6bf42f"} Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.182754 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a1ed1e-209b-4c71-b15f-44caaec70e93" path="/var/lib/kubelet/pods/15a1ed1e-209b-4c71-b15f-44caaec70e93/volumes" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.751302 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-6l4ld"] Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.752905 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.756492 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.760988 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-6l4ld"] Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881186 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881259 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vmcr\" (UniqueName: \"kubernetes.io/projected/cd10f498-e095-400a-b2d3-8f702ec64eec-kube-api-access-9vmcr\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881309 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881610 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-svc\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881806 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881852 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-config\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.881932 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.983783 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: 
\"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.983841 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-config\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.983913 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.983962 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.983989 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vmcr\" (UniqueName: \"kubernetes.io/projected/cd10f498-e095-400a-b2d3-8f702ec64eec-kube-api-access-9vmcr\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.984025 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.984115 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-svc\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.984789 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.984849 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-config\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.984849 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc 
kubenswrapper[4852]: I1210 12:17:34.985260 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.985329 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:34 crc kubenswrapper[4852]: I1210 12:17:34.985897 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-svc\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:35 crc kubenswrapper[4852]: I1210 12:17:35.004906 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vmcr\" (UniqueName: \"kubernetes.io/projected/cd10f498-e095-400a-b2d3-8f702ec64eec-kube-api-access-9vmcr\") pod \"dnsmasq-dns-d558885bc-6l4ld\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:35 crc kubenswrapper[4852]: I1210 12:17:35.087280 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:35 crc kubenswrapper[4852]: I1210 12:17:35.244113 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"63001b32-e957-4b24-a742-7932191e7598","Type":"ContainerStarted","Data":"5404e488e4f3bee4bf2884a29f3f8d831b9aceb9af47fc7b7424c607cf87ce24"} Dec 10 12:17:35 crc kubenswrapper[4852]: I1210 12:17:35.265159 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"280ccc25-3ba2-46ea-b167-19480cb76a48","Type":"ContainerStarted","Data":"6037264b1633262a1100fcb1821dc817714766e32b75f77e22b448ff648f8b44"} Dec 10 12:17:35 crc kubenswrapper[4852]: I1210 12:17:35.547464 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-6l4ld"] Dec 10 12:17:36 crc kubenswrapper[4852]: I1210 12:17:36.279503 4852 generic.go:334] "Generic (PLEG): container finished" podID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerID="feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0" exitCode=0 Dec 10 12:17:36 crc kubenswrapper[4852]: I1210 12:17:36.279627 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" event={"ID":"cd10f498-e095-400a-b2d3-8f702ec64eec","Type":"ContainerDied","Data":"feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0"} Dec 10 12:17:36 crc kubenswrapper[4852]: I1210 12:17:36.279913 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" event={"ID":"cd10f498-e095-400a-b2d3-8f702ec64eec","Type":"ContainerStarted","Data":"88e936ef14b415047042670bbc8c5f738228d066cac9470afac0b0fd1bba68cb"} Dec 10 12:17:37 crc kubenswrapper[4852]: I1210 12:17:37.291007 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" 
event={"ID":"cd10f498-e095-400a-b2d3-8f702ec64eec","Type":"ContainerStarted","Data":"1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7"} Dec 10 12:17:37 crc kubenswrapper[4852]: I1210 12:17:37.291501 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:37 crc kubenswrapper[4852]: I1210 12:17:37.321884 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" podStartSLOduration=3.321861931 podStartE2EDuration="3.321861931s" podCreationTimestamp="2025-12-10 12:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:17:37.310841996 +0000 UTC m=+1543.396367230" watchObservedRunningTime="2025-12-10 12:17:37.321861931 +0000 UTC m=+1543.407387195" Dec 10 12:17:43 crc kubenswrapper[4852]: I1210 12:17:43.929769 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nl8sq"] Dec 10 12:17:43 crc kubenswrapper[4852]: I1210 12:17:43.932570 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:43 crc kubenswrapper[4852]: I1210 12:17:43.941620 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nl8sq"] Dec 10 12:17:43 crc kubenswrapper[4852]: I1210 12:17:43.942506 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8zs4\" (UniqueName: \"kubernetes.io/projected/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-kube-api-access-z8zs4\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:43 crc kubenswrapper[4852]: I1210 12:17:43.942690 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-utilities\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:43 crc kubenswrapper[4852]: I1210 12:17:43.942795 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-catalog-content\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.044143 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8zs4\" (UniqueName: \"kubernetes.io/projected/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-kube-api-access-z8zs4\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.044224 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-utilities\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.044288 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-catalog-content\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.044798 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-utilities\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.044850 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-catalog-content\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.065743 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8zs4\" (UniqueName: \"kubernetes.io/projected/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-kube-api-access-z8zs4\") pod \"certified-operators-nl8sq\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.251295 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:44 crc kubenswrapper[4852]: I1210 12:17:44.756195 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nl8sq"] Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.089407 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.145337 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"] Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.145598 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerName="dnsmasq-dns" containerID="cri-o://b7241b7a62bb67de94d7caf316ae111319baeba133bfa65463dcf02e0b6c550c" gracePeriod=10 Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.364388 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-rl9rt"] Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.366826 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.374157 4852 generic.go:334] "Generic (PLEG): container finished" podID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerID="22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113" exitCode=0 Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.374275 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerDied","Data":"22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113"} Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.374305 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerStarted","Data":"a4cef9a5164d991dfb334a6f7b3af5f364050268c9f8679030198d5e193451b5"} Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.381788 4852 generic.go:334] "Generic (PLEG): container finished" podID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerID="b7241b7a62bb67de94d7caf316ae111319baeba133bfa65463dcf02e0b6c550c" exitCode=0 Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.381836 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" event={"ID":"e54e2f70-296d-4e1c-a293-72b7a09e1e35","Type":"ContainerDied","Data":"b7241b7a62bb67de94d7caf316ae111319baeba133bfa65463dcf02e0b6c550c"} Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.383334 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-rl9rt"] Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474088 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474139 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474191 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474218 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgzrm\" (UniqueName: \"kubernetes.io/projected/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-kube-api-access-bgzrm\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474424 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474475 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.474751 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-config\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576462 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-config\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576603 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576640 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576682 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576717 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgzrm\" (UniqueName: \"kubernetes.io/projected/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-kube-api-access-bgzrm\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576757 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.576783 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.578041 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.578819 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-config\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.579448 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.579910 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.579999 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.580457 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.612154 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgzrm\" (UniqueName: \"kubernetes.io/projected/4e3cbf64-e31a-4f5b-a045-8a3de2cba72b-kube-api-access-bgzrm\") pod \"dnsmasq-dns-78c64bc9c5-rl9rt\" (UID: \"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.684785 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.900255 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.991076 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-nb\") pod \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.991187 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwp8s\" (UniqueName: \"kubernetes.io/projected/e54e2f70-296d-4e1c-a293-72b7a09e1e35-kube-api-access-jwp8s\") pod \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.991264 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-sb\") pod \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.991327 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-svc\") pod \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.991374 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-swift-storage-0\") pod \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " Dec 10 12:17:45 crc kubenswrapper[4852]: I1210 12:17:45.991393 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-config\") pod \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\" (UID: \"e54e2f70-296d-4e1c-a293-72b7a09e1e35\") " Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.000873 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54e2f70-296d-4e1c-a293-72b7a09e1e35-kube-api-access-jwp8s" (OuterVolumeSpecName: "kube-api-access-jwp8s") pod "e54e2f70-296d-4e1c-a293-72b7a09e1e35" (UID: "e54e2f70-296d-4e1c-a293-72b7a09e1e35"). InnerVolumeSpecName "kube-api-access-jwp8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.067560 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e54e2f70-296d-4e1c-a293-72b7a09e1e35" (UID: "e54e2f70-296d-4e1c-a293-72b7a09e1e35"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.069502 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-config" (OuterVolumeSpecName: "config") pod "e54e2f70-296d-4e1c-a293-72b7a09e1e35" (UID: "e54e2f70-296d-4e1c-a293-72b7a09e1e35"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.069705 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e54e2f70-296d-4e1c-a293-72b7a09e1e35" (UID: "e54e2f70-296d-4e1c-a293-72b7a09e1e35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.079411 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e54e2f70-296d-4e1c-a293-72b7a09e1e35" (UID: "e54e2f70-296d-4e1c-a293-72b7a09e1e35"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.082616 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e54e2f70-296d-4e1c-a293-72b7a09e1e35" (UID: "e54e2f70-296d-4e1c-a293-72b7a09e1e35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.096745 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.096804 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwp8s\" (UniqueName: \"kubernetes.io/projected/e54e2f70-296d-4e1c-a293-72b7a09e1e35-kube-api-access-jwp8s\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.096821 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.096833 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.096847 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.096860 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54e2f70-296d-4e1c-a293-72b7a09e1e35-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.357577 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-rl9rt"] Dec 10 12:17:46 crc kubenswrapper[4852]: W1210 12:17:46.358751 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e3cbf64_e31a_4f5b_a045_8a3de2cba72b.slice/crio-958dfd4b79c9203d9586224f142a91dea2115821f9f24e7dbe008a976c157862 WatchSource:0}: Error finding container 958dfd4b79c9203d9586224f142a91dea2115821f9f24e7dbe008a976c157862: Status 404 
returned error can't find the container with id 958dfd4b79c9203d9586224f142a91dea2115821f9f24e7dbe008a976c157862 Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.398335 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.398347 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vxvl2" event={"ID":"e54e2f70-296d-4e1c-a293-72b7a09e1e35","Type":"ContainerDied","Data":"c8495202f53c575b920450d450e69a7bc8553b11eb16caaea7dc0e39233b8d26"} Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.398407 4852 scope.go:117] "RemoveContainer" containerID="b7241b7a62bb67de94d7caf316ae111319baeba133bfa65463dcf02e0b6c550c" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.408613 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerStarted","Data":"e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea"} Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.410505 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" event={"ID":"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b","Type":"ContainerStarted","Data":"958dfd4b79c9203d9586224f142a91dea2115821f9f24e7dbe008a976c157862"} Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.634877 4852 scope.go:117] "RemoveContainer" containerID="fbda619d8439d29cfc5c6e9954351b828053472ffc928c07ef12197aa6ba71f8" Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.666801 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"] Dec 10 12:17:46 crc kubenswrapper[4852]: I1210 12:17:46.678935 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vxvl2"] Dec 10 12:17:47 crc kubenswrapper[4852]: I1210 12:17:47.422506 4852 generic.go:334] "Generic (PLEG): container finished" podID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerID="e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea" exitCode=0 Dec 10 12:17:47 crc kubenswrapper[4852]: I1210 12:17:47.422877 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerDied","Data":"e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea"} Dec 10 12:17:47 crc kubenswrapper[4852]: I1210 12:17:47.427548 4852 generic.go:334] "Generic (PLEG): container finished" podID="4e3cbf64-e31a-4f5b-a045-8a3de2cba72b" containerID="43aae148b2c721a57ee9ac3b1cf4b9bd57dc8db67324a6e1ad2732b0ac376430" exitCode=0 Dec 10 12:17:47 crc kubenswrapper[4852]: I1210 12:17:47.428019 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" event={"ID":"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b","Type":"ContainerDied","Data":"43aae148b2c721a57ee9ac3b1cf4b9bd57dc8db67324a6e1ad2732b0ac376430"} Dec 10 12:17:48 crc kubenswrapper[4852]: I1210 12:17:48.186819 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" path="/var/lib/kubelet/pods/e54e2f70-296d-4e1c-a293-72b7a09e1e35/volumes" Dec 10 12:17:48 crc kubenswrapper[4852]: I1210 12:17:48.440716 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" 
event={"ID":"4e3cbf64-e31a-4f5b-a045-8a3de2cba72b","Type":"ContainerStarted","Data":"43310d654df6bffaf7997f9b1b7a44098cffd0b968eb8c45eabde8d331194049"} Dec 10 12:17:48 crc kubenswrapper[4852]: I1210 12:17:48.440972 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:48 crc kubenswrapper[4852]: I1210 12:17:48.459774 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" podStartSLOduration=3.45975774 podStartE2EDuration="3.45975774s" podCreationTimestamp="2025-12-10 12:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:17:48.456605631 +0000 UTC m=+1554.542130865" watchObservedRunningTime="2025-12-10 12:17:48.45975774 +0000 UTC m=+1554.545282964" Dec 10 12:17:50 crc kubenswrapper[4852]: I1210 12:17:50.461948 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerStarted","Data":"62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711"} Dec 10 12:17:50 crc kubenswrapper[4852]: I1210 12:17:50.488331 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nl8sq" podStartSLOduration=3.49424215 podStartE2EDuration="7.488306907s" podCreationTimestamp="2025-12-10 12:17:43 +0000 UTC" firstStartedPulling="2025-12-10 12:17:45.378111728 +0000 UTC m=+1551.463636952" lastFinishedPulling="2025-12-10 12:17:49.372176485 +0000 UTC m=+1555.457701709" observedRunningTime="2025-12-10 12:17:50.483343173 +0000 UTC m=+1556.568868397" watchObservedRunningTime="2025-12-10 12:17:50.488306907 +0000 UTC m=+1556.573832131" Dec 10 12:17:54 crc kubenswrapper[4852]: I1210 12:17:54.251830 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:54 crc kubenswrapper[4852]: I1210 12:17:54.252728 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:54 crc kubenswrapper[4852]: I1210 12:17:54.314442 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:55 crc kubenswrapper[4852]: I1210 12:17:55.578802 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:55 crc kubenswrapper[4852]: I1210 12:17:55.642831 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nl8sq"] Dec 10 12:17:55 crc kubenswrapper[4852]: I1210 12:17:55.688014 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-rl9rt" Dec 10 12:17:55 crc kubenswrapper[4852]: I1210 12:17:55.765593 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-6l4ld"] Dec 10 12:17:55 crc kubenswrapper[4852]: I1210 12:17:55.765811 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerName="dnsmasq-dns" containerID="cri-o://1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7" gracePeriod=10 Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.298707 
4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418593 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-openstack-edpm-ipam\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418724 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vmcr\" (UniqueName: \"kubernetes.io/projected/cd10f498-e095-400a-b2d3-8f702ec64eec-kube-api-access-9vmcr\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418757 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-swift-storage-0\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418809 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-svc\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418849 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-sb\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418910 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-nb\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.418956 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-config\") pod \"cd10f498-e095-400a-b2d3-8f702ec64eec\" (UID: \"cd10f498-e095-400a-b2d3-8f702ec64eec\") " Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.445331 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd10f498-e095-400a-b2d3-8f702ec64eec-kube-api-access-9vmcr" (OuterVolumeSpecName: "kube-api-access-9vmcr") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "kube-api-access-9vmcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.474711 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.475245 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.476280 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-config" (OuterVolumeSpecName: "config") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.478334 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.483993 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.486115 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cd10f498-e095-400a-b2d3-8f702ec64eec" (UID: "cd10f498-e095-400a-b2d3-8f702ec64eec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523432 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vmcr\" (UniqueName: \"kubernetes.io/projected/cd10f498-e095-400a-b2d3-8f702ec64eec-kube-api-access-9vmcr\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523587 4852 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523671 4852 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523720 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523736 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523665 4852 generic.go:334] "Generic (PLEG): container finished" podID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerID="1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7" exitCode=0 Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523721 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" event={"ID":"cd10f498-e095-400a-b2d3-8f702ec64eec","Type":"ContainerDied","Data":"1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7"} Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.523995 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-6l4ld" event={"ID":"cd10f498-e095-400a-b2d3-8f702ec64eec","Type":"ContainerDied","Data":"88e936ef14b415047042670bbc8c5f738228d066cac9470afac0b0fd1bba68cb"} Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.524040 4852 scope.go:117] "RemoveContainer" containerID="1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.524159 4852 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.524354 4852 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.524378 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cd10f498-e095-400a-b2d3-8f702ec64eec-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.561128 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-6l4ld"] Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.567604 4852 scope.go:117] "RemoveContainer" containerID="feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.569375 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-6l4ld"] Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.586396 4852 scope.go:117] "RemoveContainer" containerID="1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7" Dec 10 12:17:56 crc kubenswrapper[4852]: E1210 12:17:56.586918 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7\": container with ID starting with 1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7 not found: ID does not exist" containerID="1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.586941 4852 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7"} err="failed to get container status \"1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7\": rpc error: code = NotFound desc = could not find container \"1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7\": container with ID starting with 1d34118c7c2c6df8531d713155e26e928001854ce8824645f4a0ea5b564addf7 not found: ID does not exist" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.586961 4852 scope.go:117] "RemoveContainer" containerID="feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0" Dec 10 12:17:56 crc kubenswrapper[4852]: E1210 12:17:56.587338 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0\": container with ID starting with feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0 not found: ID does not exist" containerID="feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0" Dec 10 12:17:56 crc kubenswrapper[4852]: I1210 12:17:56.587367 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0"} err="failed to get container status \"feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0\": rpc error: code = NotFound desc = could not find container \"feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0\": container with ID starting with feecf171e590b824ef0b5985cb33accab5d689e9996489ec42f577b665acb9b0 not found: ID does not exist" Dec 10 12:17:57 crc kubenswrapper[4852]: I1210 12:17:57.535683 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nl8sq" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="registry-server" containerID="cri-o://62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711" gracePeriod=2 Dec 10 12:17:58 crc kubenswrapper[4852]: I1210 12:17:58.180353 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" path="/var/lib/kubelet/pods/cd10f498-e095-400a-b2d3-8f702ec64eec/volumes" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.429581 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.558991 4852 generic.go:334] "Generic (PLEG): container finished" podID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerID="62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711" exitCode=0 Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.559041 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nl8sq" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.559062 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerDied","Data":"62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711"} Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.559741 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl8sq" event={"ID":"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003","Type":"ContainerDied","Data":"a4cef9a5164d991dfb334a6f7b3af5f364050268c9f8679030198d5e193451b5"} Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.559760 4852 scope.go:117] "RemoveContainer" containerID="62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.585227 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-catalog-content\") pod \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.585356 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8zs4\" (UniqueName: \"kubernetes.io/projected/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-kube-api-access-z8zs4\") pod \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.585455 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-utilities\") pod \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\" (UID: \"3be5ff1b-1d3e-4d1d-9187-67f7c6a58003\") " Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.585987 4852 scope.go:117] "RemoveContainer" containerID="e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.586717 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-utilities" (OuterVolumeSpecName: "utilities") pod "3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" (UID: "3be5ff1b-1d3e-4d1d-9187-67f7c6a58003"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.590670 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-kube-api-access-z8zs4" (OuterVolumeSpecName: "kube-api-access-z8zs4") pod "3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" (UID: "3be5ff1b-1d3e-4d1d-9187-67f7c6a58003"). InnerVolumeSpecName "kube-api-access-z8zs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.634678 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" (UID: "3be5ff1b-1d3e-4d1d-9187-67f7c6a58003"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.642199 4852 scope.go:117] "RemoveContainer" containerID="22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.687710 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.687745 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.687760 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8zs4\" (UniqueName: \"kubernetes.io/projected/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003-kube-api-access-z8zs4\") on node \"crc\" DevicePath \"\"" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.688289 4852 scope.go:117] "RemoveContainer" containerID="62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711" Dec 10 12:17:59 crc kubenswrapper[4852]: E1210 12:17:59.689035 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711\": container with ID starting with 62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711 not found: ID does not exist" containerID="62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.689072 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711"} err="failed to get container status \"62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711\": rpc error: code = NotFound desc = could not find container \"62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711\": container with ID starting with 62ea8f04ea48b27e5ad3d3799286b938f919375fdd776c92d9a05dc066f59711 not found: ID does not exist" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.689121 4852 scope.go:117] "RemoveContainer" containerID="e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea" Dec 10 12:17:59 crc kubenswrapper[4852]: E1210 12:17:59.689586 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea\": container with ID starting with e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea not found: ID does not exist" containerID="e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.689622 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea"} err="failed to get container status \"e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea\": rpc error: code = NotFound desc = could not find container \"e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea\": container with ID starting with e39b6d9716c93bf189c30a9a1e20eef48fa636a39852e1eabdb04361d6e772ea not found: ID does not exist" Dec 10 12:17:59 crc 
kubenswrapper[4852]: I1210 12:17:59.689649 4852 scope.go:117] "RemoveContainer" containerID="22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113" Dec 10 12:17:59 crc kubenswrapper[4852]: E1210 12:17:59.690100 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113\": container with ID starting with 22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113 not found: ID does not exist" containerID="22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.690179 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113"} err="failed to get container status \"22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113\": rpc error: code = NotFound desc = could not find container \"22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113\": container with ID starting with 22fa68a522eaef44dae9340a3e372ee595a544bdb64156dea6aa18f3ea271113 not found: ID does not exist" Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.919729 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nl8sq"] Dec 10 12:17:59 crc kubenswrapper[4852]: I1210 12:17:59.926981 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nl8sq"] Dec 10 12:18:00 crc kubenswrapper[4852]: I1210 12:18:00.185460 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" path="/var/lib/kubelet/pods/3be5ff1b-1d3e-4d1d-9187-67f7c6a58003/volumes" Dec 10 12:18:02 crc kubenswrapper[4852]: I1210 12:18:02.967327 4852 scope.go:117] "RemoveContainer" containerID="ea77f26c94cc33aac957f4c873d4145eb53a779c953ed08b5e911dbdea82a9da" Dec 10 12:18:03 crc kubenswrapper[4852]: I1210 12:18:03.010394 4852 scope.go:117] "RemoveContainer" containerID="5ceacbd6d1013e770d05fe646f2feaee65d35997b19a46b693ae435d4dfe80ba" Dec 10 12:18:06 crc kubenswrapper[4852]: I1210 12:18:06.635412 4852 generic.go:334] "Generic (PLEG): container finished" podID="63001b32-e957-4b24-a742-7932191e7598" containerID="5404e488e4f3bee4bf2884a29f3f8d831b9aceb9af47fc7b7424c607cf87ce24" exitCode=0 Dec 10 12:18:06 crc kubenswrapper[4852]: I1210 12:18:06.635501 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"63001b32-e957-4b24-a742-7932191e7598","Type":"ContainerDied","Data":"5404e488e4f3bee4bf2884a29f3f8d831b9aceb9af47fc7b7424c607cf87ce24"} Dec 10 12:18:06 crc kubenswrapper[4852]: I1210 12:18:06.639186 4852 generic.go:334] "Generic (PLEG): container finished" podID="280ccc25-3ba2-46ea-b167-19480cb76a48" containerID="6037264b1633262a1100fcb1821dc817714766e32b75f77e22b448ff648f8b44" exitCode=0 Dec 10 12:18:06 crc kubenswrapper[4852]: I1210 12:18:06.639278 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"280ccc25-3ba2-46ea-b167-19480cb76a48","Type":"ContainerDied","Data":"6037264b1633262a1100fcb1821dc817714766e32b75f77e22b448ff648f8b44"} Dec 10 12:18:07 crc kubenswrapper[4852]: I1210 12:18:07.650940 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"63001b32-e957-4b24-a742-7932191e7598","Type":"ContainerStarted","Data":"235d901b0a392be9ba08b08ae9875fe423e7124d317fcfeddb0628ea8239d64f"} Dec 10 12:18:07 crc kubenswrapper[4852]: I1210 12:18:07.652692 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 10 12:18:07 crc kubenswrapper[4852]: I1210 12:18:07.654831 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"280ccc25-3ba2-46ea-b167-19480cb76a48","Type":"ContainerStarted","Data":"538b4a9093a0062431f260694c4617e7426126c689ade7082eb11465449d8caf"} Dec 10 12:18:07 crc kubenswrapper[4852]: I1210 12:18:07.655529 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 10 12:18:07 crc kubenswrapper[4852]: I1210 12:18:07.678352 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.678331867 podStartE2EDuration="35.678331867s" podCreationTimestamp="2025-12-10 12:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:18:07.675495146 +0000 UTC m=+1573.761020390" watchObservedRunningTime="2025-12-10 12:18:07.678331867 +0000 UTC m=+1573.763857091" Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.222781 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.222759346 podStartE2EDuration="37.222759346s" podCreationTimestamp="2025-12-10 12:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:18:07.717054465 +0000 UTC m=+1573.802579689" watchObservedRunningTime="2025-12-10 12:18:08.222759346 +0000 UTC m=+1574.308284570" Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228414 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"] Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228793 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerName="init" Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228809 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerName="init" Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228823 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="extract-utilities" Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228831 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="extract-utilities" Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228844 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="extract-content" Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228850 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="extract-content" Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228866 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerName="init" Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228871 4852 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerName="init"
Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228883 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerName="dnsmasq-dns"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228890 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerName="dnsmasq-dns"
Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228908 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerName="dnsmasq-dns"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228914 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerName="dnsmasq-dns"
Dec 10 12:18:08 crc kubenswrapper[4852]: E1210 12:18:08.228935 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="registry-server"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.228940 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="registry-server"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.229113 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be5ff1b-1d3e-4d1d-9187-67f7c6a58003" containerName="registry-server"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.229136 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd10f498-e095-400a-b2d3-8f702ec64eec" containerName="dnsmasq-dns"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.229147 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54e2f70-296d-4e1c-a293-72b7a09e1e35" containerName="dnsmasq-dns"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.229816 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.233591 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.234221 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.234388 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.234842 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.248173 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"]
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.265961 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.266007 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.266088 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.266104 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nk7q\" (UniqueName: \"kubernetes.io/projected/19836285-fe41-4d6e-8f05-b5aeac635c5c-kube-api-access-4nk7q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.367691 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.367743 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.367833 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nk7q\" (UniqueName: \"kubernetes.io/projected/19836285-fe41-4d6e-8f05-b5aeac635c5c-kube-api-access-4nk7q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.367857 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.372762 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.373963 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.375668 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.389254 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nk7q\" (UniqueName: \"kubernetes.io/projected/19836285-fe41-4d6e-8f05-b5aeac635c5c-kube-api-access-4nk7q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:08 crc kubenswrapper[4852]: I1210 12:18:08.553114 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:18:09 crc kubenswrapper[4852]: I1210 12:18:09.143699 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"]
Dec 10 12:18:09 crc kubenswrapper[4852]: I1210 12:18:09.678527 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j" event={"ID":"19836285-fe41-4d6e-8f05-b5aeac635c5c","Type":"ContainerStarted","Data":"05766f170b9dc3c18a192fc9f35a7e4eb852918dd72c885bce73e5550667dc2b"}
Dec 10 12:18:20 crc kubenswrapper[4852]: I1210 12:18:20.803452 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j" event={"ID":"19836285-fe41-4d6e-8f05-b5aeac635c5c","Type":"ContainerStarted","Data":"70c34e272280c32870e5e3514bb13b56d28f3c567b36b8e54323257ae2213411"}
Dec 10 12:18:20 crc kubenswrapper[4852]: I1210 12:18:20.825216 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j" podStartSLOduration=2.106068209 podStartE2EDuration="12.825187371s" podCreationTimestamp="2025-12-10 12:18:08 +0000 UTC" firstStartedPulling="2025-12-10 12:18:09.157757717 +0000 UTC m=+1575.243282951" lastFinishedPulling="2025-12-10 12:18:19.876876889 +0000 UTC m=+1585.962402113" observedRunningTime="2025-12-10 12:18:20.820282129 +0000 UTC m=+1586.905807393" watchObservedRunningTime="2025-12-10 12:18:20.825187371 +0000 UTC m=+1586.910712625"
Dec 10 12:18:22 crc kubenswrapper[4852]: I1210 12:18:22.034554 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 10 12:18:22 crc kubenswrapper[4852]: I1210 12:18:22.637487 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 10 12:19:03 crc kubenswrapper[4852]: I1210 12:19:03.124076 4852 scope.go:117] "RemoveContainer" containerID="a0bba005ffdf5ec712b59d4c1ea55ae6093e224ae6d4642a76152003f665a9dd"
Dec 10 12:19:03 crc kubenswrapper[4852]: I1210 12:19:03.192214 4852 scope.go:117] "RemoveContainer" containerID="f0a8c30e5408dcb44ac3d5a10083d6fd81f0da70fcc45393280458a7d25d7a8c"
Dec 10 12:19:03 crc kubenswrapper[4852]: I1210 12:19:03.247192 4852 scope.go:117] "RemoveContainer" containerID="5a3b36140f09aca820be9e1a35063ca5d596d2e26409271f171f8e0acf5346da"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.355192 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qrsn8"]
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.358677 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.379126 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrsn8"]
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.527912 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk8br\" (UniqueName: \"kubernetes.io/projected/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-kube-api-access-tk8br\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.528039 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-catalog-content\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.528100 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-utilities\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.629708 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-utilities\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.629825 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk8br\" (UniqueName: \"kubernetes.io/projected/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-kube-api-access-tk8br\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.629930 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-catalog-content\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.630304 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-utilities\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.630327 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-catalog-content\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.657266 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk8br\" (UniqueName: \"kubernetes.io/projected/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-kube-api-access-tk8br\") pod \"community-operators-qrsn8\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") " pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:05 crc kubenswrapper[4852]: I1210 12:19:05.677996 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:06 crc kubenswrapper[4852]: I1210 12:19:06.230101 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrsn8"]
Dec 10 12:19:06 crc kubenswrapper[4852]: I1210 12:19:06.274834 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrsn8" event={"ID":"b71c87ad-48a9-40fa-b360-dd3b31ec3d60","Type":"ContainerStarted","Data":"76d17cd16907eb3f10cd38bad1b8f339e4079fee845e63b5bf10352d61b03ec8"}
Dec 10 12:19:07 crc kubenswrapper[4852]: I1210 12:19:07.300865 4852 generic.go:334] "Generic (PLEG): container finished" podID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerID="9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507" exitCode=0
Dec 10 12:19:07 crc kubenswrapper[4852]: I1210 12:19:07.301212 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrsn8" event={"ID":"b71c87ad-48a9-40fa-b360-dd3b31ec3d60","Type":"ContainerDied","Data":"9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507"}
Dec 10 12:19:10 crc kubenswrapper[4852]: I1210 12:19:10.334196 4852 generic.go:334] "Generic (PLEG): container finished" podID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerID="b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2" exitCode=0
Dec 10 12:19:10 crc kubenswrapper[4852]: I1210 12:19:10.334292 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrsn8" event={"ID":"b71c87ad-48a9-40fa-b360-dd3b31ec3d60","Type":"ContainerDied","Data":"b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2"}
Dec 10 12:19:12 crc kubenswrapper[4852]: I1210 12:19:12.355037 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrsn8" event={"ID":"b71c87ad-48a9-40fa-b360-dd3b31ec3d60","Type":"ContainerStarted","Data":"42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745"}
Dec 10 12:19:12 crc kubenswrapper[4852]: I1210 12:19:12.379038 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qrsn8" podStartSLOduration=2.863986438 podStartE2EDuration="7.379017883s" podCreationTimestamp="2025-12-10 12:19:05 +0000 UTC" firstStartedPulling="2025-12-10 12:19:07.305026075 +0000 UTC m=+1633.390551299" lastFinishedPulling="2025-12-10 12:19:11.82005751 +0000 UTC m=+1637.905582744" observedRunningTime="2025-12-10 12:19:12.371161007 +0000 UTC m=+1638.456686221" watchObservedRunningTime="2025-12-10 12:19:12.379017883 +0000 UTC m=+1638.464543107"
Dec 10 12:19:15 crc kubenswrapper[4852]: I1210 12:19:15.678173 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:15 crc kubenswrapper[4852]: I1210 12:19:15.678509 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:15 crc kubenswrapper[4852]: I1210 12:19:15.767541 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:15 crc kubenswrapper[4852]: I1210 12:19:15.790527 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:19:15 crc kubenswrapper[4852]: I1210 12:19:15.790593 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:19:25 crc kubenswrapper[4852]: I1210 12:19:25.733165 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:25 crc kubenswrapper[4852]: I1210 12:19:25.794665 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrsn8"]
Dec 10 12:19:26 crc kubenswrapper[4852]: I1210 12:19:26.503641 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qrsn8" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="registry-server" containerID="cri-o://42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745" gracePeriod=2
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:26.942475 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.048345 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-catalog-content\") pod \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") "
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.048404 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk8br\" (UniqueName: \"kubernetes.io/projected/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-kube-api-access-tk8br\") pod \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") "
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.049652 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-utilities\") pod \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\" (UID: \"b71c87ad-48a9-40fa-b360-dd3b31ec3d60\") "
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.050544 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-utilities" (OuterVolumeSpecName: "utilities") pod "b71c87ad-48a9-40fa-b360-dd3b31ec3d60" (UID: "b71c87ad-48a9-40fa-b360-dd3b31ec3d60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.059711 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-kube-api-access-tk8br" (OuterVolumeSpecName: "kube-api-access-tk8br") pod "b71c87ad-48a9-40fa-b360-dd3b31ec3d60" (UID: "b71c87ad-48a9-40fa-b360-dd3b31ec3d60"). InnerVolumeSpecName "kube-api-access-tk8br". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.104818 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b71c87ad-48a9-40fa-b360-dd3b31ec3d60" (UID: "b71c87ad-48a9-40fa-b360-dd3b31ec3d60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.151832 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.151875 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk8br\" (UniqueName: \"kubernetes.io/projected/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-kube-api-access-tk8br\") on node \"crc\" DevicePath \"\""
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.151891 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71c87ad-48a9-40fa-b360-dd3b31ec3d60-utilities\") on node \"crc\" DevicePath \"\""
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.515712 4852 generic.go:334] "Generic (PLEG): container finished" podID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerID="42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745" exitCode=0
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.515773 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrsn8" event={"ID":"b71c87ad-48a9-40fa-b360-dd3b31ec3d60","Type":"ContainerDied","Data":"42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745"}
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.515789 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrsn8"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.515819 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrsn8" event={"ID":"b71c87ad-48a9-40fa-b360-dd3b31ec3d60","Type":"ContainerDied","Data":"76d17cd16907eb3f10cd38bad1b8f339e4079fee845e63b5bf10352d61b03ec8"}
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.515848 4852 scope.go:117] "RemoveContainer" containerID="42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.545804 4852 scope.go:117] "RemoveContainer" containerID="b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.546577 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrsn8"]
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.555131 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qrsn8"]
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.565039 4852 scope.go:117] "RemoveContainer" containerID="9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.603812 4852 scope.go:117] "RemoveContainer" containerID="42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745"
Dec 10 12:19:27 crc kubenswrapper[4852]: E1210 12:19:27.604163 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745\": container with ID starting with 42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745 not found: ID does not exist" containerID="42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.604190 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745"} err="failed to get container status \"42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745\": rpc error: code = NotFound desc = could not find container \"42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745\": container with ID starting with 42bbcc47eb59f02238b1bcdc6540d68dd0ece28742eed62896ffb04a57a93745 not found: ID does not exist"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.604212 4852 scope.go:117] "RemoveContainer" containerID="b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2"
Dec 10 12:19:27 crc kubenswrapper[4852]: E1210 12:19:27.604509 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2\": container with ID starting with b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2 not found: ID does not exist" containerID="b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.604528 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2"} err="failed to get container status \"b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2\": rpc error: code = NotFound desc = could not find container \"b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2\": container with ID starting with b44dbe1a5c55899957a7b87816c30f44da1bdaff59237e4be8d56256d4473aa2 not found: ID does not exist"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.604541 4852 scope.go:117] "RemoveContainer" containerID="9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507"
Dec 10 12:19:27 crc kubenswrapper[4852]: E1210 12:19:27.604741 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507\": container with ID starting with 9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507 not found: ID does not exist" containerID="9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507"
Dec 10 12:19:27 crc kubenswrapper[4852]: I1210 12:19:27.604762 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507"} err="failed to get container status \"9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507\": rpc error: code = NotFound desc = could not find container \"9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507\": container with ID starting with 9b95132bf06de366e64c0d8bef438be90033806b396165b2356c2d40edcb5507 not found: ID does not exist"
Dec 10 12:19:28 crc kubenswrapper[4852]: I1210 12:19:28.204419 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" path="/var/lib/kubelet/pods/b71c87ad-48a9-40fa-b360-dd3b31ec3d60/volumes"
Dec 10 12:19:45 crc kubenswrapper[4852]: I1210 12:19:45.790216 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:19:45 crc kubenswrapper[4852]: I1210 12:19:45.790744 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.790682 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.791729 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.791806 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh"
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.793122 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.793190 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" gracePeriod=600
Dec 10 12:20:15 crc kubenswrapper[4852]: E1210 12:20:15.916870 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.992310 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" exitCode=0
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.992382 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"}
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.992445 4852 scope.go:117] "RemoveContainer" containerID="3c62f223218c8d67bf458bba29b25f48f874ad6d23f1af6c44094e9bc123c137"
Dec 10 12:20:15 crc kubenswrapper[4852]: I1210 12:20:15.993433 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:20:15 crc kubenswrapper[4852]: E1210 12:20:15.993950 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:20:17 crc kubenswrapper[4852]: I1210 12:20:17.004595 4852 generic.go:334] "Generic (PLEG): container finished" podID="19836285-fe41-4d6e-8f05-b5aeac635c5c" containerID="70c34e272280c32870e5e3514bb13b56d28f3c567b36b8e54323257ae2213411" exitCode=0
Dec 10 12:20:17 crc kubenswrapper[4852]: I1210 12:20:17.004705 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j" event={"ID":"19836285-fe41-4d6e-8f05-b5aeac635c5c","Type":"ContainerDied","Data":"70c34e272280c32870e5e3514bb13b56d28f3c567b36b8e54323257ae2213411"}
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.454302 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.473176 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-inventory\") pod \"19836285-fe41-4d6e-8f05-b5aeac635c5c\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") "
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.473968 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-ssh-key\") pod \"19836285-fe41-4d6e-8f05-b5aeac635c5c\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") "
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.501809 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-inventory" (OuterVolumeSpecName: "inventory") pod "19836285-fe41-4d6e-8f05-b5aeac635c5c" (UID: "19836285-fe41-4d6e-8f05-b5aeac635c5c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.505949 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19836285-fe41-4d6e-8f05-b5aeac635c5c" (UID: "19836285-fe41-4d6e-8f05-b5aeac635c5c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.575690 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-repo-setup-combined-ca-bundle\") pod \"19836285-fe41-4d6e-8f05-b5aeac635c5c\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") "
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.575735 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nk7q\" (UniqueName: \"kubernetes.io/projected/19836285-fe41-4d6e-8f05-b5aeac635c5c-kube-api-access-4nk7q\") pod \"19836285-fe41-4d6e-8f05-b5aeac635c5c\" (UID: \"19836285-fe41-4d6e-8f05-b5aeac635c5c\") "
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.576087 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.576109 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.579740 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19836285-fe41-4d6e-8f05-b5aeac635c5c-kube-api-access-4nk7q" (OuterVolumeSpecName: "kube-api-access-4nk7q") pod "19836285-fe41-4d6e-8f05-b5aeac635c5c" (UID: "19836285-fe41-4d6e-8f05-b5aeac635c5c"). InnerVolumeSpecName "kube-api-access-4nk7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.579915 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "19836285-fe41-4d6e-8f05-b5aeac635c5c" (UID: "19836285-fe41-4d6e-8f05-b5aeac635c5c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.677331 4852 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19836285-fe41-4d6e-8f05-b5aeac635c5c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:18 crc kubenswrapper[4852]: I1210 12:20:18.677542 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nk7q\" (UniqueName: \"kubernetes.io/projected/19836285-fe41-4d6e-8f05-b5aeac635c5c-kube-api-access-4nk7q\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.028302 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j" event={"ID":"19836285-fe41-4d6e-8f05-b5aeac635c5c","Type":"ContainerDied","Data":"05766f170b9dc3c18a192fc9f35a7e4eb852918dd72c885bce73e5550667dc2b"}
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.028345 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05766f170b9dc3c18a192fc9f35a7e4eb852918dd72c885bce73e5550667dc2b"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.028635 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.106976 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"]
Dec 10 12:20:19 crc kubenswrapper[4852]: E1210 12:20:19.107497 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="extract-content"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.107513 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="extract-content"
Dec 10 12:20:19 crc kubenswrapper[4852]: E1210 12:20:19.107529 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="extract-utilities"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.107536 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="extract-utilities"
Dec 10 12:20:19 crc kubenswrapper[4852]: E1210 12:20:19.107549 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="registry-server"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.107555 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="registry-server"
Dec 10 12:20:19 crc kubenswrapper[4852]: E1210 12:20:19.107563 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19836285-fe41-4d6e-8f05-b5aeac635c5c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.107570 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="19836285-fe41-4d6e-8f05-b5aeac635c5c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.107750 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="19836285-fe41-4d6e-8f05-b5aeac635c5c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.107764 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71c87ad-48a9-40fa-b360-dd3b31ec3d60" containerName="registry-server"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.117179 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.120182 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.120526 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.120687 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.121130 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"]
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.122000 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.187014 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.187086 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.187162 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7crf\" (UniqueName: \"kubernetes.io/projected/2bdc2caf-227b-4210-bdbd-adf085cf4e27-kube-api-access-k7crf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.288992 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7crf\" (UniqueName: \"kubernetes.io/projected/2bdc2caf-227b-4210-bdbd-adf085cf4e27-kube-api-access-k7crf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.289130 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.289179 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.292874 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.292928 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.305608 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7crf\" (UniqueName: \"kubernetes.io/projected/2bdc2caf-227b-4210-bdbd-adf085cf4e27-kube-api-access-k7crf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf6kc\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.454463 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.967070 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 10 12:20:19 crc kubenswrapper[4852]: I1210 12:20:19.968609 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"]
Dec 10 12:20:20 crc kubenswrapper[4852]: I1210 12:20:20.038537 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc" event={"ID":"2bdc2caf-227b-4210-bdbd-adf085cf4e27","Type":"ContainerStarted","Data":"623b167be28cb4456ddbe3cd868968b14689390ed79db66e227eabe72395bb10"}
Dec 10 12:20:22 crc kubenswrapper[4852]: I1210 12:20:22.057919 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc" event={"ID":"2bdc2caf-227b-4210-bdbd-adf085cf4e27","Type":"ContainerStarted","Data":"51db30964a9e253815fcc9dd75c8a081210b1e8e00c24e5f248779c34baaff92"}
Dec 10 12:20:22 crc kubenswrapper[4852]: I1210 12:20:22.076035 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc" podStartSLOduration=1.87060914 podStartE2EDuration="3.076018615s" podCreationTimestamp="2025-12-10 12:20:19 +0000 UTC" firstStartedPulling="2025-12-10 12:20:19.966762265 +0000 UTC m=+1706.052287489" lastFinishedPulling="2025-12-10 12:20:21.17217174 +0000 UTC m=+1707.257696964" observedRunningTime="2025-12-10 12:20:22.074055536 +0000 UTC m=+1708.159580760" watchObservedRunningTime="2025-12-10 12:20:22.076018615 +0000 UTC m=+1708.161543839"
Dec 10 12:20:24 crc kubenswrapper[4852]: I1210 12:20:24.077645 4852 generic.go:334] "Generic (PLEG): container finished" podID="2bdc2caf-227b-4210-bdbd-adf085cf4e27" containerID="51db30964a9e253815fcc9dd75c8a081210b1e8e00c24e5f248779c34baaff92" exitCode=0
Dec 10 12:20:24 crc kubenswrapper[4852]: I1210 12:20:24.077717 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc" event={"ID":"2bdc2caf-227b-4210-bdbd-adf085cf4e27","Type":"ContainerDied","Data":"51db30964a9e253815fcc9dd75c8a081210b1e8e00c24e5f248779c34baaff92"}
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.482207 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.613493 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-inventory\") pod \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") "
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.613579 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-ssh-key\") pod \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") "
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.613747 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7crf\" (UniqueName: \"kubernetes.io/projected/2bdc2caf-227b-4210-bdbd-adf085cf4e27-kube-api-access-k7crf\") pod \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\" (UID: \"2bdc2caf-227b-4210-bdbd-adf085cf4e27\") "
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.619782 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bdc2caf-227b-4210-bdbd-adf085cf4e27-kube-api-access-k7crf" (OuterVolumeSpecName: "kube-api-access-k7crf") pod "2bdc2caf-227b-4210-bdbd-adf085cf4e27" (UID: "2bdc2caf-227b-4210-bdbd-adf085cf4e27"). InnerVolumeSpecName "kube-api-access-k7crf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.643025 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-inventory" (OuterVolumeSpecName: "inventory") pod "2bdc2caf-227b-4210-bdbd-adf085cf4e27" (UID: "2bdc2caf-227b-4210-bdbd-adf085cf4e27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.646744 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2bdc2caf-227b-4210-bdbd-adf085cf4e27" (UID: "2bdc2caf-227b-4210-bdbd-adf085cf4e27"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.716630 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.716687 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bdc2caf-227b-4210-bdbd-adf085cf4e27-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:25 crc kubenswrapper[4852]: I1210 12:20:25.716708 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7crf\" (UniqueName: \"kubernetes.io/projected/2bdc2caf-227b-4210-bdbd-adf085cf4e27-kube-api-access-k7crf\") on node \"crc\" DevicePath \"\""
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.101463 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc" event={"ID":"2bdc2caf-227b-4210-bdbd-adf085cf4e27","Type":"ContainerDied","Data":"623b167be28cb4456ddbe3cd868968b14689390ed79db66e227eabe72395bb10"}
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.101517 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623b167be28cb4456ddbe3cd868968b14689390ed79db66e227eabe72395bb10"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.101524 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf6kc"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.189476 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"]
Dec 10 12:20:26 crc kubenswrapper[4852]: E1210 12:20:26.189984 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdc2caf-227b-4210-bdbd-adf085cf4e27" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.190011 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdc2caf-227b-4210-bdbd-adf085cf4e27" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.190280 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bdc2caf-227b-4210-bdbd-adf085cf4e27" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.191636 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"]
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.191732 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.195266 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.195907 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.196054 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.196202 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.332723 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lg8p\" (UniqueName: \"kubernetes.io/projected/3ed1622b-fe84-4402-b15c-6971dde2a93f-kube-api-access-5lg8p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.332830 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.332907 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.333081 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.434512 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.434642 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.434675 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lg8p\" (UniqueName: \"kubernetes.io/projected/3ed1622b-fe84-4402-b15c-6971dde2a93f-kube-api-access-5lg8p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.434726 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.439933 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.442138 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.444738 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.458416 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lg8p\" (UniqueName: \"kubernetes.io/projected/3ed1622b-fe84-4402-b15c-6971dde2a93f-kube-api-access-5lg8p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:26 crc kubenswrapper[4852]: I1210 12:20:26.510154 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"
Dec 10 12:20:27 crc kubenswrapper[4852]: I1210 12:20:27.064495 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp"]
Dec 10 12:20:27 crc kubenswrapper[4852]: W1210 12:20:27.065594 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed1622b_fe84_4402_b15c_6971dde2a93f.slice/crio-caaf1f09c930a053f5291b09758a305dfd0e6c344c290ee79c5d0bcbdafc47b2 WatchSource:0}: Error finding container caaf1f09c930a053f5291b09758a305dfd0e6c344c290ee79c5d0bcbdafc47b2: Status 404 returned error can't find the container with id caaf1f09c930a053f5291b09758a305dfd0e6c344c290ee79c5d0bcbdafc47b2
Dec 10 12:20:27 crc kubenswrapper[4852]: I1210 12:20:27.111893 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" event={"ID":"3ed1622b-fe84-4402-b15c-6971dde2a93f","Type":"ContainerStarted","Data":"caaf1f09c930a053f5291b09758a305dfd0e6c344c290ee79c5d0bcbdafc47b2"}
Dec 10 12:20:27 crc kubenswrapper[4852]: I1210 12:20:27.170417 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:20:27 crc kubenswrapper[4852]: E1210 12:20:27.170740 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:20:29 crc kubenswrapper[4852]: I1210 12:20:29.135027 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" event={"ID":"3ed1622b-fe84-4402-b15c-6971dde2a93f","Type":"ContainerStarted","Data":"8dbd80cc0d864a54a4a21ac70cec1f2c07e042f5b484c73d98469632866ed7ef"}
Dec 10 12:20:29 crc kubenswrapper[4852]: I1210 12:20:29.167748 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" podStartSLOduration=1.6759581909999999 podStartE2EDuration="3.167721595s" podCreationTimestamp="2025-12-10 12:20:26 +0000 UTC" firstStartedPulling="2025-12-10 12:20:27.068810513 +0000 UTC m=+1713.154335737" lastFinishedPulling="2025-12-10 12:20:28.560573927 +0000 UTC m=+1714.646099141" observedRunningTime="2025-12-10 12:20:29.159506419 +0000 UTC m=+1715.245031723" watchObservedRunningTime="2025-12-10 12:20:29.167721595 +0000 UTC m=+1715.253246849"
Dec 10 12:20:40 crc kubenswrapper[4852]: I1210 12:20:40.170739 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:20:40 crc kubenswrapper[4852]: E1210 12:20:40.171539 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:20:55 crc kubenswrapper[4852]: I1210 12:20:55.169562 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:20:55 crc kubenswrapper[4852]: E1210 12:20:55.170320 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:21:08 crc kubenswrapper[4852]: I1210 12:21:08.170648 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:21:08 crc kubenswrapper[4852]: E1210 12:21:08.172660 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:21:20 crc kubenswrapper[4852]: I1210 12:21:20.170343 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:21:20 crc kubenswrapper[4852]: E1210 12:21:20.171139 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:21:34 crc kubenswrapper[4852]: I1210 12:21:34.179819 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:21:34 crc kubenswrapper[4852]: E1210 12:21:34.180934 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:21:49 crc kubenswrapper[4852]: I1210 12:21:49.169821 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:21:49 crc kubenswrapper[4852]: E1210 12:21:49.170904 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:22:03 crc kubenswrapper[4852]: I1210 12:22:03.431054 4852 scope.go:117] "RemoveContainer" containerID="c58b23ccaf518a2f72729195b40d6bca8271c7eacb17384f74613307f5af9fe1"
Dec 10 12:22:03 crc kubenswrapper[4852]: I1210 12:22:03.459861
4852 scope.go:117] "RemoveContainer" containerID="6de144ae826874068a80869fb720cd775b1ff94ea940ae31fa46ff2b73124366" Dec 10 12:22:03 crc kubenswrapper[4852]: I1210 12:22:03.482382 4852 scope.go:117] "RemoveContainer" containerID="223fcf80ddc788967b643b71d2e171d53ef87db912e4ef2e32d508020ee33fbe" Dec 10 12:22:04 crc kubenswrapper[4852]: I1210 12:22:04.170631 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:22:04 crc kubenswrapper[4852]: E1210 12:22:04.171177 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.044517 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3a6c-account-create-update-gcs96"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.099456 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4cwxn"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.113005 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f1e3-account-create-update-kn7lg"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.120369 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3a6c-account-create-update-gcs96"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.145028 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qfvrl"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.155947 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4cwxn"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.166273 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f1e3-account-create-update-kn7lg"] Dec 10 12:22:13 crc kubenswrapper[4852]: I1210 12:22:13.175577 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qfvrl"] Dec 10 12:22:14 crc kubenswrapper[4852]: I1210 12:22:14.181411 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220ec4be-95ca-4ace-967b-f7bf22c7d11a" path="/var/lib/kubelet/pods/220ec4be-95ca-4ace-967b-f7bf22c7d11a/volumes" Dec 10 12:22:14 crc kubenswrapper[4852]: I1210 12:22:14.183324 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8ae564-c6ed-47c8-9952-f18311d280c5" path="/var/lib/kubelet/pods/2f8ae564-c6ed-47c8-9952-f18311d280c5/volumes" Dec 10 12:22:14 crc kubenswrapper[4852]: I1210 12:22:14.184103 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b019d2c6-723d-49ac-953b-a5b624876c5c" path="/var/lib/kubelet/pods/b019d2c6-723d-49ac-953b-a5b624876c5c/volumes" Dec 10 12:22:14 crc kubenswrapper[4852]: I1210 12:22:14.184900 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d008d61b-41dd-4c4d-be4c-d4a1de845bb5" path="/var/lib/kubelet/pods/d008d61b-41dd-4c4d-be4c-d4a1de845bb5/volumes" Dec 10 12:22:15 crc kubenswrapper[4852]: I1210 12:22:15.028576 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hlx47"] Dec 10 12:22:15 crc kubenswrapper[4852]: I1210 12:22:15.040717 4852 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hlx47"] Dec 10 12:22:15 crc kubenswrapper[4852]: I1210 12:22:15.052526 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-aa09-account-create-update-2h8ln"] Dec 10 12:22:15 crc kubenswrapper[4852]: I1210 12:22:15.062514 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-aa09-account-create-update-2h8ln"] Dec 10 12:22:16 crc kubenswrapper[4852]: I1210 12:22:16.181134 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1316e219-f771-4b26-9329-4e110779b164" path="/var/lib/kubelet/pods/1316e219-f771-4b26-9329-4e110779b164/volumes" Dec 10 12:22:16 crc kubenswrapper[4852]: I1210 12:22:16.182862 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb9c92f-a03f-43e3-8f43-336e4236feee" path="/var/lib/kubelet/pods/dcb9c92f-a03f-43e3-8f43-336e4236feee/volumes" Dec 10 12:22:17 crc kubenswrapper[4852]: I1210 12:22:17.171163 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:22:17 crc kubenswrapper[4852]: E1210 12:22:17.171593 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:22:28 crc kubenswrapper[4852]: I1210 12:22:28.169882 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:22:28 crc kubenswrapper[4852]: E1210 12:22:28.170640 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:22:38 crc kubenswrapper[4852]: I1210 12:22:38.038849 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vf7gw"] Dec 10 12:22:38 crc kubenswrapper[4852]: I1210 12:22:38.047785 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vf7gw"] Dec 10 12:22:38 crc kubenswrapper[4852]: I1210 12:22:38.188123 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90efc41b-4c77-4a6b-bd78-544fd72b4078" path="/var/lib/kubelet/pods/90efc41b-4c77-4a6b-bd78-544fd72b4078/volumes" Dec 10 12:22:39 crc kubenswrapper[4852]: I1210 12:22:39.041392 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5kl56"] Dec 10 12:22:39 crc kubenswrapper[4852]: I1210 12:22:39.051287 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5kl56"] Dec 10 12:22:40 crc kubenswrapper[4852]: I1210 12:22:40.169888 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:22:40 crc kubenswrapper[4852]: E1210 12:22:40.170340 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:22:40 crc kubenswrapper[4852]: I1210 12:22:40.202423 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c123338-9441-4ded-a116-4d7d80f3032a" path="/var/lib/kubelet/pods/5c123338-9441-4ded-a116-4d7d80f3032a/volumes" Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.030901 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fvvwn"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.040516 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2aa1-account-create-update-x8g74"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.049721 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a82e-account-create-update-267jv"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.060020 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f122-account-create-update-zk7j7"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.067806 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fvvwn"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.074931 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2aa1-account-create-update-x8g74"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.084530 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a82e-account-create-update-267jv"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.091396 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f122-account-create-update-zk7j7"] Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.185323 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29dea678-28c8-44ea-857d-41437f6f9b24" path="/var/lib/kubelet/pods/29dea678-28c8-44ea-857d-41437f6f9b24/volumes" Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.185963 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659aefe7-f624-4b85-9c5c-a5aab3a1a95a" path="/var/lib/kubelet/pods/659aefe7-f624-4b85-9c5c-a5aab3a1a95a/volumes" Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.186659 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b687583-40f8-447f-b8fc-25fe8796f99a" path="/var/lib/kubelet/pods/9b687583-40f8-447f-b8fc-25fe8796f99a/volumes" Dec 10 12:22:44 crc kubenswrapper[4852]: I1210 12:22:44.187419 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b506fd-90e8-4807-877a-ebb8ec15f09f" path="/var/lib/kubelet/pods/d4b506fd-90e8-4807-877a-ebb8ec15f09f/volumes" Dec 10 12:22:47 crc kubenswrapper[4852]: I1210 12:22:47.030127 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8f6wn"] Dec 10 12:22:47 crc kubenswrapper[4852]: I1210 12:22:47.039402 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8f6wn"] Dec 10 12:22:48 crc kubenswrapper[4852]: I1210 12:22:48.182001 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4bd98a-477c-4b9a-8cab-ccb69f404b1f" path="/var/lib/kubelet/pods/bf4bd98a-477c-4b9a-8cab-ccb69f404b1f/volumes" Dec 10 12:22:49 crc kubenswrapper[4852]: I1210 12:22:49.029767 4852 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-645x7"] Dec 10 12:22:49 crc kubenswrapper[4852]: I1210 12:22:49.039965 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-645x7"] Dec 10 12:22:50 crc kubenswrapper[4852]: I1210 12:22:50.186676 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c84e037b-cf95-44a8-b0e5-b3b468a89166" path="/var/lib/kubelet/pods/c84e037b-cf95-44a8-b0e5-b3b468a89166/volumes" Dec 10 12:22:54 crc kubenswrapper[4852]: I1210 12:22:54.176511 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:22:54 crc kubenswrapper[4852]: E1210 12:22:54.177208 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.590830 4852 scope.go:117] "RemoveContainer" containerID="bff7c261363fc78a78427b3858cda8b95909a5e3fa7cb79344426b4dcbc773b2" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.621174 4852 scope.go:117] "RemoveContainer" containerID="a2dabaea4bb210129f40509462e06fb569608a9546eec76af81547c84af67341" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.691820 4852 scope.go:117] "RemoveContainer" containerID="4578bcf7c7353951318b77dfa1c7e841e4135f8d07001766c9b0a493d4fa9b69" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.738114 4852 scope.go:117] "RemoveContainer" containerID="1ffe03c76992cec29c6f69e23aa99890b376e1e78b78461f84806272db8dccb7" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.778192 4852 scope.go:117] "RemoveContainer" containerID="eaa96143faf031c6036d4b42e5d7cfb0636f82e5143483d529e280cc9bfcd620" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.818876 4852 scope.go:117] "RemoveContainer" containerID="5bc45a10e41ffb6cd0f3175e59b83e6eef74c5d06c4c7e9820d0d98800596681" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.861353 4852 scope.go:117] "RemoveContainer" containerID="b168317c146d94963b81d71b26279cae19d0b1d9039b615019aa13c7cf02241a" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.884533 4852 scope.go:117] "RemoveContainer" containerID="d0b3e7795546ac037395e937bf6b92120661c52eed7432cd66bdf290b39b19a3" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.940881 4852 scope.go:117] "RemoveContainer" containerID="71815324b55fbf56089a5d2f7f0ba6c1662438abf686b0a3e3c13f263a05652b" Dec 10 12:23:03 crc kubenswrapper[4852]: I1210 12:23:03.973378 4852 scope.go:117] "RemoveContainer" containerID="b8e136c591aa74b6312a40d51cf9eefec5f9a5a1c7b03164d17b0242ff78188a" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 12:23:04.017154 4852 scope.go:117] "RemoveContainer" containerID="fc8d70481fb0b61bba279033a8a92c3734ad340a913236591c8a59a160bc71b4" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 12:23:04.041225 4852 scope.go:117] "RemoveContainer" containerID="7924114f3b8bc59cc02f7b301f5475503cbe2cc3c6b41d08048c478a7e2caafe" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 12:23:04.102777 4852 scope.go:117] "RemoveContainer" containerID="79e9bcb12a3d7a77a06e17d9e6a118924c2191cec910cb5bb6d49f6a784f678c" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 
12:23:04.122033 4852 scope.go:117] "RemoveContainer" containerID="7fe1bf6a1a2c8b9580c73582bce2d1629adc5fb021b1ab6db4cb9c2295f39451" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 12:23:04.145660 4852 scope.go:117] "RemoveContainer" containerID="1aa1a099437d331a2f218accc67f02364aacea8c5f8eb2392afe3c73fd40c48f" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 12:23:04.178481 4852 scope.go:117] "RemoveContainer" containerID="41036f6903dbcd535fe9cfd575b15e14f222db0f46936332e3592760ebcea532" Dec 10 12:23:04 crc kubenswrapper[4852]: I1210 12:23:04.203633 4852 scope.go:117] "RemoveContainer" containerID="3b205233875c7a02bfec5a7c14297fe9ef15b31d1f506db45e99af795747ca8f" Dec 10 12:23:08 crc kubenswrapper[4852]: I1210 12:23:08.170654 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:23:08 crc kubenswrapper[4852]: E1210 12:23:08.171318 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:23:21 crc kubenswrapper[4852]: I1210 12:23:21.169710 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:23:21 crc kubenswrapper[4852]: E1210 12:23:21.170467 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:23:32 crc kubenswrapper[4852]: I1210 12:23:32.170283 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:23:32 crc kubenswrapper[4852]: E1210 12:23:32.171612 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:23:38 crc kubenswrapper[4852]: I1210 12:23:38.124488 4852 generic.go:334] "Generic (PLEG): container finished" podID="3ed1622b-fe84-4402-b15c-6971dde2a93f" containerID="8dbd80cc0d864a54a4a21ac70cec1f2c07e042f5b484c73d98469632866ed7ef" exitCode=0 Dec 10 12:23:38 crc kubenswrapper[4852]: I1210 12:23:38.124589 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" event={"ID":"3ed1622b-fe84-4402-b15c-6971dde2a93f","Type":"ContainerDied","Data":"8dbd80cc0d864a54a4a21ac70cec1f2c07e042f5b484c73d98469632866ed7ef"} Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.057284 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bw8zp"] Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.067985 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-ml9gs"] Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.084764 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bw8zp"] Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.093492 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ml9gs"] Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.565873 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.706179 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-bootstrap-combined-ca-bundle\") pod \"3ed1622b-fe84-4402-b15c-6971dde2a93f\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.706539 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-ssh-key\") pod \"3ed1622b-fe84-4402-b15c-6971dde2a93f\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.706646 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lg8p\" (UniqueName: \"kubernetes.io/projected/3ed1622b-fe84-4402-b15c-6971dde2a93f-kube-api-access-5lg8p\") pod \"3ed1622b-fe84-4402-b15c-6971dde2a93f\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.706760 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-inventory\") pod \"3ed1622b-fe84-4402-b15c-6971dde2a93f\" (UID: \"3ed1622b-fe84-4402-b15c-6971dde2a93f\") " Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.713044 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3ed1622b-fe84-4402-b15c-6971dde2a93f" (UID: "3ed1622b-fe84-4402-b15c-6971dde2a93f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.713343 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed1622b-fe84-4402-b15c-6971dde2a93f-kube-api-access-5lg8p" (OuterVolumeSpecName: "kube-api-access-5lg8p") pod "3ed1622b-fe84-4402-b15c-6971dde2a93f" (UID: "3ed1622b-fe84-4402-b15c-6971dde2a93f"). InnerVolumeSpecName "kube-api-access-5lg8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.735431 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ed1622b-fe84-4402-b15c-6971dde2a93f" (UID: "3ed1622b-fe84-4402-b15c-6971dde2a93f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.744649 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-inventory" (OuterVolumeSpecName: "inventory") pod "3ed1622b-fe84-4402-b15c-6971dde2a93f" (UID: "3ed1622b-fe84-4402-b15c-6971dde2a93f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.809112 4852 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.809162 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.809177 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lg8p\" (UniqueName: \"kubernetes.io/projected/3ed1622b-fe84-4402-b15c-6971dde2a93f-kube-api-access-5lg8p\") on node \"crc\" DevicePath \"\"" Dec 10 12:23:39 crc kubenswrapper[4852]: I1210 12:23:39.809189 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ed1622b-fe84-4402-b15c-6971dde2a93f-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.145993 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" event={"ID":"3ed1622b-fe84-4402-b15c-6971dde2a93f","Type":"ContainerDied","Data":"caaf1f09c930a053f5291b09758a305dfd0e6c344c290ee79c5d0bcbdafc47b2"} Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.146037 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caaf1f09c930a053f5291b09758a305dfd0e6c344c290ee79c5d0bcbdafc47b2" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.146094 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.188324 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517ef493-1599-408d-bf6d-0e0eaef4d28c" path="/var/lib/kubelet/pods/517ef493-1599-408d-bf6d-0e0eaef4d28c/volumes" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.188978 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db4372d-41b4-4247-97ab-7f27026c2a82" path="/var/lib/kubelet/pods/5db4372d-41b4-4247-97ab-7f27026c2a82/volumes" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.233586 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm"] Dec 10 12:23:40 crc kubenswrapper[4852]: E1210 12:23:40.234189 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed1622b-fe84-4402-b15c-6971dde2a93f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.234273 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed1622b-fe84-4402-b15c-6971dde2a93f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.234484 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed1622b-fe84-4402-b15c-6971dde2a93f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.235271 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.237416 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.237797 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.237985 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.238217 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.249114 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm"] Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.317874 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.318334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 
12:23:40.318412 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrdf\" (UniqueName: \"kubernetes.io/projected/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-kube-api-access-qqrdf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.419555 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.419656 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrdf\" (UniqueName: \"kubernetes.io/projected/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-kube-api-access-qqrdf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.419756 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.423833 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.424090 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.453753 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrdf\" (UniqueName: \"kubernetes.io/projected/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-kube-api-access-qqrdf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wrflm\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:40 crc kubenswrapper[4852]: I1210 12:23:40.557757 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" Dec 10 12:23:41 crc kubenswrapper[4852]: I1210 12:23:41.059980 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm"] Dec 10 12:23:41 crc kubenswrapper[4852]: I1210 12:23:41.158176 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" event={"ID":"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82","Type":"ContainerStarted","Data":"5a8ed80f67601fede501de519c4c710e84f955d5ef3b990e5ab3dab7fceab0f3"} Dec 10 12:23:42 crc kubenswrapper[4852]: I1210 12:23:42.180608 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" event={"ID":"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82","Type":"ContainerStarted","Data":"7aae4105c3cbb6f47349d3b709b45a33d91aa433010bae501f5e0879fa8ced6d"} Dec 10 12:23:42 crc kubenswrapper[4852]: I1210 12:23:42.197292 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" podStartSLOduration=1.664337822 podStartE2EDuration="2.197275045s" podCreationTimestamp="2025-12-10 12:23:40 +0000 UTC" firstStartedPulling="2025-12-10 12:23:41.065418639 +0000 UTC m=+1907.150943873" lastFinishedPulling="2025-12-10 12:23:41.598355872 +0000 UTC m=+1907.683881096" observedRunningTime="2025-12-10 12:23:42.186562857 +0000 UTC m=+1908.272088081" watchObservedRunningTime="2025-12-10 12:23:42.197275045 +0000 UTC m=+1908.282800269" Dec 10 12:23:44 crc kubenswrapper[4852]: I1210 12:23:44.179362 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:23:44 crc kubenswrapper[4852]: E1210 12:23:44.179936 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:23:46 crc kubenswrapper[4852]: I1210 12:23:46.038609 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-l7ltl"] Dec 10 12:23:46 crc kubenswrapper[4852]: I1210 12:23:46.049467 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-l7ltl"] Dec 10 12:23:46 crc kubenswrapper[4852]: I1210 12:23:46.184499 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0bf8197-7342-4289-ae56-f606b479778b" path="/var/lib/kubelet/pods/b0bf8197-7342-4289-ae56-f606b479778b/volumes" Dec 10 12:23:55 crc kubenswrapper[4852]: I1210 12:23:55.031893 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mwvfs"] Dec 10 12:23:55 crc kubenswrapper[4852]: I1210 12:23:55.042990 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mwvfs"] Dec 10 12:23:56 crc kubenswrapper[4852]: I1210 12:23:56.183199 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71453d1b-e7a6-44a3-a449-b1c10eb76997" path="/var/lib/kubelet/pods/71453d1b-e7a6-44a3-a449-b1c10eb76997/volumes" Dec 10 12:23:57 crc kubenswrapper[4852]: I1210 12:23:57.169964 4852 scope.go:117] "RemoveContainer" 
containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:23:57 crc kubenswrapper[4852]: E1210 12:23:57.170270 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:24:01 crc kubenswrapper[4852]: I1210 12:24:01.046121 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qwzww"] Dec 10 12:24:01 crc kubenswrapper[4852]: I1210 12:24:01.055875 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qwzww"] Dec 10 12:24:02 crc kubenswrapper[4852]: I1210 12:24:02.181935 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c" path="/var/lib/kubelet/pods/8cc9fcd2-20c3-439e-9b1a-45bc8fb10f8c/volumes" Dec 10 12:24:04 crc kubenswrapper[4852]: I1210 12:24:04.471113 4852 scope.go:117] "RemoveContainer" containerID="22ecf46a2bcdff4b0e33d8ff9c3cf00f444cde102496f3e4cc42c286b898cac4" Dec 10 12:24:04 crc kubenswrapper[4852]: I1210 12:24:04.521839 4852 scope.go:117] "RemoveContainer" containerID="cd1eb2d39a5971f12ea73ff19a309c97ebb40e08d5efb16be5f6e6223ce830db" Dec 10 12:24:04 crc kubenswrapper[4852]: I1210 12:24:04.565918 4852 scope.go:117] "RemoveContainer" containerID="fd502ff720569629780faae2475f0059b48c63ab992be0e2b745816d33a1238f" Dec 10 12:24:04 crc kubenswrapper[4852]: I1210 12:24:04.605108 4852 scope.go:117] "RemoveContainer" containerID="f5a0bcf57bba2b7b080f1839bce4a9e1453126b83a24288123ea4b72d204ad33" Dec 10 12:24:04 crc kubenswrapper[4852]: I1210 12:24:04.658540 4852 scope.go:117] "RemoveContainer" containerID="1118f5b103cf03423c8ff6da9869b2aff3c5c22afe380c5ff63466b3dd5aaedd" Dec 10 12:24:12 crc kubenswrapper[4852]: I1210 12:24:12.170617 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:24:12 crc kubenswrapper[4852]: E1210 12:24:12.171410 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.529809 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-snmrg"] Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.532567 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.545980 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snmrg"] Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.568801 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7x9v\" (UniqueName: \"kubernetes.io/projected/118205ea-f4de-4f56-a5f0-621fc2dbacba-kube-api-access-l7x9v\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.568870 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-catalog-content\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.568958 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-utilities\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.671388 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7x9v\" (UniqueName: \"kubernetes.io/projected/118205ea-f4de-4f56-a5f0-621fc2dbacba-kube-api-access-l7x9v\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.671458 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-catalog-content\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.671509 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-utilities\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.672040 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-catalog-content\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.672042 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-utilities\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.692043 4852 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l7x9v\" (UniqueName: \"kubernetes.io/projected/118205ea-f4de-4f56-a5f0-621fc2dbacba-kube-api-access-l7x9v\") pod \"redhat-marketplace-snmrg\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:22 crc kubenswrapper[4852]: I1210 12:24:22.855948 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:23 crc kubenswrapper[4852]: I1210 12:24:23.344809 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snmrg"] Dec 10 12:24:23 crc kubenswrapper[4852]: I1210 12:24:23.569153 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snmrg" event={"ID":"118205ea-f4de-4f56-a5f0-621fc2dbacba","Type":"ContainerStarted","Data":"3202a70b1b184efe3161ac65a4d8ceb875da1470b2da9bd8a64f31ceebef6928"} Dec 10 12:24:24 crc kubenswrapper[4852]: I1210 12:24:24.577913 4852 generic.go:334] "Generic (PLEG): container finished" podID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerID="16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57" exitCode=0 Dec 10 12:24:24 crc kubenswrapper[4852]: I1210 12:24:24.578003 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snmrg" event={"ID":"118205ea-f4de-4f56-a5f0-621fc2dbacba","Type":"ContainerDied","Data":"16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57"} Dec 10 12:24:26 crc kubenswrapper[4852]: I1210 12:24:26.183775 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:24:26 crc kubenswrapper[4852]: E1210 12:24:26.184689 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:24:26 crc kubenswrapper[4852]: I1210 12:24:26.599776 4852 generic.go:334] "Generic (PLEG): container finished" podID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerID="40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140" exitCode=0 Dec 10 12:24:26 crc kubenswrapper[4852]: I1210 12:24:26.599825 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snmrg" event={"ID":"118205ea-f4de-4f56-a5f0-621fc2dbacba","Type":"ContainerDied","Data":"40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140"} Dec 10 12:24:27 crc kubenswrapper[4852]: I1210 12:24:27.610167 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snmrg" event={"ID":"118205ea-f4de-4f56-a5f0-621fc2dbacba","Type":"ContainerStarted","Data":"61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8"} Dec 10 12:24:27 crc kubenswrapper[4852]: I1210 12:24:27.632798 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-snmrg" podStartSLOduration=2.91988857 podStartE2EDuration="5.632774751s" podCreationTimestamp="2025-12-10 12:24:22 +0000 UTC" firstStartedPulling="2025-12-10 12:24:24.58106985 +0000 UTC m=+1950.666595074" lastFinishedPulling="2025-12-10 
12:24:27.293956021 +0000 UTC m=+1953.379481255" observedRunningTime="2025-12-10 12:24:27.627216352 +0000 UTC m=+1953.712741586" watchObservedRunningTime="2025-12-10 12:24:27.632774751 +0000 UTC m=+1953.718299985" Dec 10 12:24:32 crc kubenswrapper[4852]: I1210 12:24:32.856417 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:32 crc kubenswrapper[4852]: I1210 12:24:32.856997 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:32 crc kubenswrapper[4852]: I1210 12:24:32.906082 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:33 crc kubenswrapper[4852]: I1210 12:24:33.706554 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:33 crc kubenswrapper[4852]: I1210 12:24:33.759575 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snmrg"] Dec 10 12:24:35 crc kubenswrapper[4852]: I1210 12:24:35.686353 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-snmrg" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="registry-server" containerID="cri-o://61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8" gracePeriod=2 Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.195487 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snmrg" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.343645 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-catalog-content\") pod \"118205ea-f4de-4f56-a5f0-621fc2dbacba\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.343851 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-utilities\") pod \"118205ea-f4de-4f56-a5f0-621fc2dbacba\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.343918 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7x9v\" (UniqueName: \"kubernetes.io/projected/118205ea-f4de-4f56-a5f0-621fc2dbacba-kube-api-access-l7x9v\") pod \"118205ea-f4de-4f56-a5f0-621fc2dbacba\" (UID: \"118205ea-f4de-4f56-a5f0-621fc2dbacba\") " Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.344796 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-utilities" (OuterVolumeSpecName: "utilities") pod "118205ea-f4de-4f56-a5f0-621fc2dbacba" (UID: "118205ea-f4de-4f56-a5f0-621fc2dbacba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.349775 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118205ea-f4de-4f56-a5f0-621fc2dbacba-kube-api-access-l7x9v" (OuterVolumeSpecName: "kube-api-access-l7x9v") pod "118205ea-f4de-4f56-a5f0-621fc2dbacba" (UID: "118205ea-f4de-4f56-a5f0-621fc2dbacba"). InnerVolumeSpecName "kube-api-access-l7x9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.363957 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "118205ea-f4de-4f56-a5f0-621fc2dbacba" (UID: "118205ea-f4de-4f56-a5f0-621fc2dbacba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.446324 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.446354 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7x9v\" (UniqueName: \"kubernetes.io/projected/118205ea-f4de-4f56-a5f0-621fc2dbacba-kube-api-access-l7x9v\") on node \"crc\" DevicePath \"\"" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.446365 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118205ea-f4de-4f56-a5f0-621fc2dbacba-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.697268 4852 generic.go:334] "Generic (PLEG): container finished" podID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerID="61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8" exitCode=0 Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.697313 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snmrg"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.697316 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snmrg" event={"ID":"118205ea-f4de-4f56-a5f0-621fc2dbacba","Type":"ContainerDied","Data":"61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8"}
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.697434 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snmrg" event={"ID":"118205ea-f4de-4f56-a5f0-621fc2dbacba","Type":"ContainerDied","Data":"3202a70b1b184efe3161ac65a4d8ceb875da1470b2da9bd8a64f31ceebef6928"}
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.697452 4852 scope.go:117] "RemoveContainer" containerID="61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.721476 4852 scope.go:117] "RemoveContainer" containerID="40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.732527 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snmrg"]
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.741525 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-snmrg"]
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.765158 4852 scope.go:117] "RemoveContainer" containerID="16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.795042 4852 scope.go:117] "RemoveContainer" containerID="61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8"
Dec 10 12:24:36 crc kubenswrapper[4852]: E1210 12:24:36.795651 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8\": container with ID starting with 61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8 not found: ID does not exist" containerID="61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.795692 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8"} err="failed to get container status \"61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8\": rpc error: code = NotFound desc = could not find container \"61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8\": container with ID starting with 61e1ea1805ed96763c554307cd81a14ea5558326dd5d56bc158f1a00e0147ad8 not found: ID does not exist"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.795719 4852 scope.go:117] "RemoveContainer" containerID="40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140"
Dec 10 12:24:36 crc kubenswrapper[4852]: E1210 12:24:36.796213 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140\": container with ID starting with 40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140 not found: ID does not exist" containerID="40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.796272 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140"} err="failed to get container status \"40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140\": rpc error: code = NotFound desc = could not find container \"40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140\": container with ID starting with 40b48a265e12d35e7d0617d55db52d97087598759ed7b65d93ccc53a50ec1140 not found: ID does not exist"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.796299 4852 scope.go:117] "RemoveContainer" containerID="16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57"
Dec 10 12:24:36 crc kubenswrapper[4852]: E1210 12:24:36.797046 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57\": container with ID starting with 16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57 not found: ID does not exist" containerID="16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57"
Dec 10 12:24:36 crc kubenswrapper[4852]: I1210 12:24:36.797481 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57"} err="failed to get container status \"16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57\": rpc error: code = NotFound desc = could not find container \"16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57\": container with ID starting with 16e406c02c544fe0cbdafa4aa65164242333e46b4a8fb30e00ea5db2e4714c57 not found: ID does not exist"
Dec 10 12:24:37 crc kubenswrapper[4852]: I1210 12:24:37.170257 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:24:37 crc kubenswrapper[4852]: E1210 12:24:37.170775 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:24:38 crc kubenswrapper[4852]: I1210 12:24:38.209160 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" path="/var/lib/kubelet/pods/118205ea-f4de-4f56-a5f0-621fc2dbacba/volumes"
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.058280 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mr6q4"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.070359 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qm45f"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.078095 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2f56-account-create-update-mhhlb"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.085556 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kbz4r"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.092538 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2f56-account-create-update-mhhlb"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.114657 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mr6q4"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.114723 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kbz4r"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.125441 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qm45f"]
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.181797 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e83be9-e570-4fe9-8a2d-e5a6fa941c24" path="/var/lib/kubelet/pods/41e83be9-e570-4fe9-8a2d-e5a6fa941c24/volumes"
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.182646 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df457a9-8045-4c39-abe3-31afc98aaa26" path="/var/lib/kubelet/pods/4df457a9-8045-4c39-abe3-31afc98aaa26/volumes"
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.183399 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836ad386-beab-46db-b9a0-8c31ce0791ec" path="/var/lib/kubelet/pods/836ad386-beab-46db-b9a0-8c31ce0791ec/volumes"
Dec 10 12:24:42 crc kubenswrapper[4852]: I1210 12:24:42.184152 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3f835c-31cf-4330-8951-9a1e5414b839" path="/var/lib/kubelet/pods/9c3f835c-31cf-4330-8951-9a1e5414b839/volumes"
Dec 10 12:24:43 crc kubenswrapper[4852]: I1210 12:24:43.032312 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d986-account-create-update-hhxgf"]
Dec 10 12:24:43 crc kubenswrapper[4852]: I1210 12:24:43.040050 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d986-account-create-update-hhxgf"]
Dec 10 12:24:43 crc kubenswrapper[4852]: I1210 12:24:43.048574 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2c6a-account-create-update-tnhvt"]
Dec 10 12:24:43 crc kubenswrapper[4852]: I1210 12:24:43.056306 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2c6a-account-create-update-tnhvt"]
Dec 10 12:24:44 crc kubenswrapper[4852]: I1210 12:24:44.182411 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008fada8-afb1-41ff-bd3a-06820f79cee6" path="/var/lib/kubelet/pods/008fada8-afb1-41ff-bd3a-06820f79cee6/volumes"
Dec 10 12:24:44 crc kubenswrapper[4852]: I1210 12:24:44.183213 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c26f6c-540e-46cf-abb4-48905651f901" path="/var/lib/kubelet/pods/b3c26f6c-540e-46cf-abb4-48905651f901/volumes"
Dec 10 12:24:50 crc kubenswrapper[4852]: I1210 12:24:50.170640 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:24:50 crc kubenswrapper[4852]: E1210 12:24:50.171448 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:25:03 crc kubenswrapper[4852]: I1210 12:25:03.169944 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:25:03 crc kubenswrapper[4852]: E1210 12:25:03.170802 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 12:25:04 crc kubenswrapper[4852]: I1210 12:25:04.788415 4852 scope.go:117] "RemoveContainer" containerID="bca81ee791945c82d000819a72940012d8a3904b58c7709bd3fdcc688ad548ae"
Dec 10 12:25:04 crc kubenswrapper[4852]: I1210 12:25:04.821890 4852 scope.go:117] "RemoveContainer" containerID="4a06eb7c8ba9dc9c3071d49362dad47c119e26697c724fed172a559e0edcf793"
Dec 10 12:25:04 crc kubenswrapper[4852]: I1210 12:25:04.871102 4852 scope.go:117] "RemoveContainer" containerID="965acc844e6a755369fb439ec0735e5b164260faf98a715206debc22f5f38d10"
Dec 10 12:25:04 crc kubenswrapper[4852]: I1210 12:25:04.910071 4852 scope.go:117] "RemoveContainer" containerID="f3d341118041b7819d874f7b71b76002aad445ab9373bc106672f06ed4186b9f"
Dec 10 12:25:04 crc kubenswrapper[4852]: I1210 12:25:04.963040 4852 scope.go:117] "RemoveContainer" containerID="60d633394812bf8f700c533e5f47f001ffa1826ac3b9408dab6399e46836be4b"
Dec 10 12:25:05 crc kubenswrapper[4852]: I1210 12:25:05.026463 4852 scope.go:117] "RemoveContainer" containerID="6cad90b9bf867a69fb6135afc6bbfa70fa2487292751ced5187d378666904193"
Dec 10 12:25:17 crc kubenswrapper[4852]: I1210 12:25:17.169826 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559"
Dec 10 12:25:18 crc kubenswrapper[4852]: I1210 12:25:18.065104 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"f6352925c4a9867f9532e5b81e9bae6d86319db1a6b4361b34333b9d6c861d17"}
Dec 10 12:25:35 crc kubenswrapper[4852]: I1210 12:25:35.047464 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7h6"]
Dec 10 12:25:35 crc kubenswrapper[4852]: I1210 12:25:35.055616 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7h6"]
Dec 10 12:25:36 crc kubenswrapper[4852]: I1210 12:25:36.181717 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59c1e79-8117-43c8-bd38-0f1f144271a5" path="/var/lib/kubelet/pods/f59c1e79-8117-43c8-bd38-0f1f144271a5/volumes"
Dec 10 12:25:43 crc kubenswrapper[4852]: I1210 12:25:43.541783 4852 generic.go:334] "Generic (PLEG): container finished" podID="abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" containerID="7aae4105c3cbb6f47349d3b709b45a33d91aa433010bae501f5e0879fa8ced6d" exitCode=0
Dec 10 12:25:43 crc kubenswrapper[4852]: I1210 12:25:43.541895 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" event={"ID":"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82","Type":"ContainerDied","Data":"7aae4105c3cbb6f47349d3b709b45a33d91aa433010bae501f5e0879fa8ced6d"}
Dec 10 12:25:44 crc kubenswrapper[4852]: I1210 12:25:44.947736 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.010685 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-inventory\") pod \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") "
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.011056 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-ssh-key\") pod \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") "
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.011434 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqrdf\" (UniqueName: \"kubernetes.io/projected/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-kube-api-access-qqrdf\") pod \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\" (UID: \"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82\") "
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.017450 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-kube-api-access-qqrdf" (OuterVolumeSpecName: "kube-api-access-qqrdf") pod "abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" (UID: "abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82"). InnerVolumeSpecName "kube-api-access-qqrdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.045478 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-inventory" (OuterVolumeSpecName: "inventory") pod "abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" (UID: "abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.050287 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" (UID: "abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.114155 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.114183 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqrdf\" (UniqueName: \"kubernetes.io/projected/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-kube-api-access-qqrdf\") on node \"crc\" DevicePath \"\""
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.114195 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.560355 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm" event={"ID":"abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82","Type":"ContainerDied","Data":"5a8ed80f67601fede501de519c4c710e84f955d5ef3b990e5ab3dab7fceab0f3"}
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.560400 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a8ed80f67601fede501de519c4c710e84f955d5ef3b990e5ab3dab7fceab0f3"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.560450 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wrflm"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.639562 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"]
Dec 10 12:25:45 crc kubenswrapper[4852]: E1210 12:25:45.640118 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="registry-server"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.640139 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="registry-server"
Dec 10 12:25:45 crc kubenswrapper[4852]: E1210 12:25:45.640149 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.640159 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:25:45 crc kubenswrapper[4852]: E1210 12:25:45.640176 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="extract-content"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.640184 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="extract-content"
Dec 10 12:25:45 crc kubenswrapper[4852]: E1210 12:25:45.640241 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="extract-utilities"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.640251 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="extract-utilities"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.640460 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="118205ea-f4de-4f56-a5f0-621fc2dbacba" containerName="registry-server"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.640482 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.641355 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.643348 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.643658 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.643733 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.644049 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.668685 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"]
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.727083 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbjg\" (UniqueName: \"kubernetes.io/projected/5d8bf94c-e162-497e-8f35-6171e96384a3-kube-api-access-lhbjg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.727276 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.727321 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.829209 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.829302 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.829427 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbjg\" (UniqueName: \"kubernetes.io/projected/5d8bf94c-e162-497e-8f35-6171e96384a3-kube-api-access-lhbjg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.833957 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.837389 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.848294 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbjg\" (UniqueName: \"kubernetes.io/projected/5d8bf94c-e162-497e-8f35-6171e96384a3-kube-api-access-lhbjg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:45 crc kubenswrapper[4852]: I1210 12:25:45.962827 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:25:46 crc kubenswrapper[4852]: I1210 12:25:46.482931 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 10 12:25:46 crc kubenswrapper[4852]: I1210 12:25:46.486444 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"]
Dec 10 12:25:46 crc kubenswrapper[4852]: I1210 12:25:46.572691 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2" event={"ID":"5d8bf94c-e162-497e-8f35-6171e96384a3","Type":"ContainerStarted","Data":"7c114cc253be9d8c02b61e6c39d29d722f66fa2f463bfeb606f14b50d150de95"}
Dec 10 12:25:48 crc kubenswrapper[4852]: I1210 12:25:48.592330 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2" event={"ID":"5d8bf94c-e162-497e-8f35-6171e96384a3","Type":"ContainerStarted","Data":"34bbd39568291d2c9cc07cb2f372ed7183a5d5a8b0a74e2ab6a6db33282ecb30"}
Dec 10 12:25:48 crc kubenswrapper[4852]: I1210 12:25:48.616038 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2" podStartSLOduration=2.617782388 podStartE2EDuration="3.616018584s" podCreationTimestamp="2025-12-10 12:25:45 +0000 UTC" firstStartedPulling="2025-12-10 12:25:46.48262371 +0000 UTC m=+2032.568148934" lastFinishedPulling="2025-12-10 12:25:47.480859906 +0000 UTC m=+2033.566385130" observedRunningTime="2025-12-10 12:25:48.606634439 +0000 UTC m=+2034.692159663" watchObservedRunningTime="2025-12-10 12:25:48.616018584 +0000 UTC m=+2034.701543808"
Dec 10 12:25:58 crc kubenswrapper[4852]: I1210 12:25:58.036654 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-svlcr"]
Dec 10 12:25:58 crc kubenswrapper[4852]: I1210 12:25:58.045460 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-svlcr"]
Dec 10 12:25:58 crc kubenswrapper[4852]: I1210 12:25:58.183061 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c50ff1-4991-48d8-9b54-f49f711fffb6" path="/var/lib/kubelet/pods/71c50ff1-4991-48d8-9b54-f49f711fffb6/volumes"
Dec 10 12:26:05 crc kubenswrapper[4852]: I1210 12:26:05.170173 4852 scope.go:117] "RemoveContainer" containerID="a1137ef0204eadd0d77af87557be669f16776973d46f520e5a48094514ced9b2"
Dec 10 12:26:05 crc kubenswrapper[4852]: I1210 12:26:05.241161 4852 scope.go:117] "RemoveContainer" containerID="e7f127cfc23767857e99bf4d219bdc65b7ff545da4bc7a33f028fff5ba930b28"
Dec 10 12:26:10 crc kubenswrapper[4852]: I1210 12:26:10.037251 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b68mm"]
Dec 10 12:26:10 crc kubenswrapper[4852]: I1210 12:26:10.046659 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b68mm"]
Dec 10 12:26:10 crc kubenswrapper[4852]: I1210 12:26:10.180360 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eefb3c9-b84f-4ce8-9b0e-80e3187c7902" path="/var/lib/kubelet/pods/1eefb3c9-b84f-4ce8-9b0e-80e3187c7902/volumes"
Dec 10 12:26:45 crc kubenswrapper[4852]: I1210 12:26:45.058295 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-r97dm"]
Dec 10 12:26:45 crc kubenswrapper[4852]: I1210 12:26:45.067407 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-r97dm"]
Dec 10 12:26:46 crc kubenswrapper[4852]: I1210 12:26:46.181771 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481d8815-c8ec-4eeb-aad1-bb28f7161829" path="/var/lib/kubelet/pods/481d8815-c8ec-4eeb-aad1-bb28f7161829/volumes"
Dec 10 12:27:05 crc kubenswrapper[4852]: I1210 12:27:05.317892 4852 scope.go:117] "RemoveContainer" containerID="66e553cae7591ba1c5daa51fce62c0e4f906c9fa7ead5729b146abf1dc87472d"
Dec 10 12:27:05 crc kubenswrapper[4852]: I1210 12:27:05.367918 4852 scope.go:117] "RemoveContainer" containerID="179a290e11ba989f9e7d5abcf33d2c557dc0b52bf97756e917afbd70d4feb2c9"
Dec 10 12:27:08 crc kubenswrapper[4852]: I1210 12:27:08.335741 4852 generic.go:334] "Generic (PLEG): container finished" podID="5d8bf94c-e162-497e-8f35-6171e96384a3" containerID="34bbd39568291d2c9cc07cb2f372ed7183a5d5a8b0a74e2ab6a6db33282ecb30" exitCode=0
Dec 10 12:27:08 crc kubenswrapper[4852]: I1210 12:27:08.335839 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2" event={"ID":"5d8bf94c-e162-497e-8f35-6171e96384a3","Type":"ContainerDied","Data":"34bbd39568291d2c9cc07cb2f372ed7183a5d5a8b0a74e2ab6a6db33282ecb30"}
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.789441 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.925543 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhbjg\" (UniqueName: \"kubernetes.io/projected/5d8bf94c-e162-497e-8f35-6171e96384a3-kube-api-access-lhbjg\") pod \"5d8bf94c-e162-497e-8f35-6171e96384a3\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") "
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.925619 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-inventory\") pod \"5d8bf94c-e162-497e-8f35-6171e96384a3\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") "
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.925739 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-ssh-key\") pod \"5d8bf94c-e162-497e-8f35-6171e96384a3\" (UID: \"5d8bf94c-e162-497e-8f35-6171e96384a3\") "
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.931183 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8bf94c-e162-497e-8f35-6171e96384a3-kube-api-access-lhbjg" (OuterVolumeSpecName: "kube-api-access-lhbjg") pod "5d8bf94c-e162-497e-8f35-6171e96384a3" (UID: "5d8bf94c-e162-497e-8f35-6171e96384a3"). InnerVolumeSpecName "kube-api-access-lhbjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.952802 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-inventory" (OuterVolumeSpecName: "inventory") pod "5d8bf94c-e162-497e-8f35-6171e96384a3" (UID: "5d8bf94c-e162-497e-8f35-6171e96384a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:27:09 crc kubenswrapper[4852]: I1210 12:27:09.962015 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d8bf94c-e162-497e-8f35-6171e96384a3" (UID: "5d8bf94c-e162-497e-8f35-6171e96384a3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.027654 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.027684 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhbjg\" (UniqueName: \"kubernetes.io/projected/5d8bf94c-e162-497e-8f35-6171e96384a3-kube-api-access-lhbjg\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.027695 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8bf94c-e162-497e-8f35-6171e96384a3-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.354937 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2" event={"ID":"5d8bf94c-e162-497e-8f35-6171e96384a3","Type":"ContainerDied","Data":"7c114cc253be9d8c02b61e6c39d29d722f66fa2f463bfeb606f14b50d150de95"}
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.354986 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c114cc253be9d8c02b61e6c39d29d722f66fa2f463bfeb606f14b50d150de95"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.355000 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.442791 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"]
Dec 10 12:27:10 crc kubenswrapper[4852]: E1210 12:27:10.443136 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8bf94c-e162-497e-8f35-6171e96384a3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.443153 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8bf94c-e162-497e-8f35-6171e96384a3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.443419 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8bf94c-e162-497e-8f35-6171e96384a3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.444063 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.445540 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.446320 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.446545 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.449441 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.457151 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"]
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.535360 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.535408 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bg2\" (UniqueName: \"kubernetes.io/projected/c8130005-2302-4ea1-8677-b590a256d3ec-kube-api-access-k5bg2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.535480 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.636968 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.637012 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bg2\" (UniqueName: \"kubernetes.io/projected/c8130005-2302-4ea1-8677-b590a256d3ec-kube-api-access-k5bg2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.637057 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.641292 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.646537 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.656472 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bg2\" (UniqueName: \"kubernetes.io/projected/c8130005-2302-4ea1-8677-b590a256d3ec-kube-api-access-k5bg2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:10 crc kubenswrapper[4852]: I1210 12:27:10.763180 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:11 crc kubenswrapper[4852]: I1210 12:27:11.437206 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"]
Dec 10 12:27:11 crc kubenswrapper[4852]: W1210 12:27:11.441880 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8130005_2302_4ea1_8677_b590a256d3ec.slice/crio-3c84c876b77bea0113eb9322caeb61881a1eaa9ea8864def9c86eb473c754a67 WatchSource:0}: Error finding container 3c84c876b77bea0113eb9322caeb61881a1eaa9ea8864def9c86eb473c754a67: Status 404 returned error can't find the container with id 3c84c876b77bea0113eb9322caeb61881a1eaa9ea8864def9c86eb473c754a67
Dec 10 12:27:12 crc kubenswrapper[4852]: I1210 12:27:12.372859 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb" event={"ID":"c8130005-2302-4ea1-8677-b590a256d3ec","Type":"ContainerStarted","Data":"c07c45e28a18d9651200632bbd85aced74798c799f10319bd69fb1d464c5ac26"}
Dec 10 12:27:12 crc kubenswrapper[4852]: I1210 12:27:12.373301 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb" event={"ID":"c8130005-2302-4ea1-8677-b590a256d3ec","Type":"ContainerStarted","Data":"3c84c876b77bea0113eb9322caeb61881a1eaa9ea8864def9c86eb473c754a67"}
Dec 10 12:27:12 crc kubenswrapper[4852]: I1210 12:27:12.388683 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb" podStartSLOduration=1.71928918 podStartE2EDuration="2.388664274s" podCreationTimestamp="2025-12-10 12:27:10 +0000 UTC" firstStartedPulling="2025-12-10 12:27:11.443834443 +0000 UTC m=+2117.529359667" lastFinishedPulling="2025-12-10 12:27:12.113209537 +0000 UTC m=+2118.198734761" observedRunningTime="2025-12-10 12:27:12.384736766 +0000 UTC m=+2118.470261980" watchObservedRunningTime="2025-12-10 12:27:12.388664274 +0000 UTC m=+2118.474189498"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.453517 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-crjpj"]
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.455984 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.465291 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crjpj"]
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.542834 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-utilities\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.543118 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-catalog-content\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.543245 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrk8\" (UniqueName: \"kubernetes.io/projected/df462bd0-70ee-4548-9946-edf2f621c14d-kube-api-access-llrk8\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.645680 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-catalog-content\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.645762 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llrk8\" (UniqueName: \"kubernetes.io/projected/df462bd0-70ee-4548-9946-edf2f621c14d-kube-api-access-llrk8\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.645853 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-utilities\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.646443 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-catalog-content\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.646447 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-utilities\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.680556 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrk8\" (UniqueName: \"kubernetes.io/projected/df462bd0-70ee-4548-9946-edf2f621c14d-kube-api-access-llrk8\") pod \"redhat-operators-crjpj\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") " pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:14 crc kubenswrapper[4852]: I1210 12:27:14.774833 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:15 crc kubenswrapper[4852]: I1210 12:27:15.239858 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crjpj"]
Dec 10 12:27:15 crc kubenswrapper[4852]: I1210 12:27:15.403035 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerStarted","Data":"fba742e496b494edb95df3a516077c61e05e33271f9d11a450f3b946a3fcb05a"}
Dec 10 12:27:16 crc kubenswrapper[4852]: I1210 12:27:16.412083 4852 generic.go:334] "Generic (PLEG): container finished" podID="df462bd0-70ee-4548-9946-edf2f621c14d" containerID="743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e" exitCode=0
Dec 10 12:27:16 crc kubenswrapper[4852]: I1210 12:27:16.412527 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerDied","Data":"743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e"}
Dec 10 12:27:17 crc kubenswrapper[4852]: I1210 12:27:17.427622 4852 generic.go:334] "Generic (PLEG): container finished" podID="c8130005-2302-4ea1-8677-b590a256d3ec" containerID="c07c45e28a18d9651200632bbd85aced74798c799f10319bd69fb1d464c5ac26" exitCode=0
Dec 10 12:27:17 crc kubenswrapper[4852]: I1210 12:27:17.427693 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb" event={"ID":"c8130005-2302-4ea1-8677-b590a256d3ec","Type":"ContainerDied","Data":"c07c45e28a18d9651200632bbd85aced74798c799f10319bd69fb1d464c5ac26"}
Dec 10 12:27:18 crc kubenswrapper[4852]: I1210 12:27:18.438425 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerStarted","Data":"85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad"}
Dec 10 12:27:18 crc kubenswrapper[4852]: I1210 12:27:18.890989 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.032769 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-ssh-key\") pod \"c8130005-2302-4ea1-8677-b590a256d3ec\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") "
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.032828 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5bg2\" (UniqueName: \"kubernetes.io/projected/c8130005-2302-4ea1-8677-b590a256d3ec-kube-api-access-k5bg2\") pod \"c8130005-2302-4ea1-8677-b590a256d3ec\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") "
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.032910 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-inventory\") pod \"c8130005-2302-4ea1-8677-b590a256d3ec\" (UID: \"c8130005-2302-4ea1-8677-b590a256d3ec\") "
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.039054 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8130005-2302-4ea1-8677-b590a256d3ec-kube-api-access-k5bg2" (OuterVolumeSpecName: "kube-api-access-k5bg2") pod "c8130005-2302-4ea1-8677-b590a256d3ec" (UID: "c8130005-2302-4ea1-8677-b590a256d3ec"). InnerVolumeSpecName "kube-api-access-k5bg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.063280 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c8130005-2302-4ea1-8677-b590a256d3ec" (UID: "c8130005-2302-4ea1-8677-b590a256d3ec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.068351 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-inventory" (OuterVolumeSpecName: "inventory") pod "c8130005-2302-4ea1-8677-b590a256d3ec" (UID: "c8130005-2302-4ea1-8677-b590a256d3ec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.135053 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5bg2\" (UniqueName: \"kubernetes.io/projected/c8130005-2302-4ea1-8677-b590a256d3ec-kube-api-access-k5bg2\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.135089 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.135098 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8130005-2302-4ea1-8677-b590a256d3ec-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.446466 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb" event={"ID":"c8130005-2302-4ea1-8677-b590a256d3ec","Type":"ContainerDied","Data":"3c84c876b77bea0113eb9322caeb61881a1eaa9ea8864def9c86eb473c754a67"}
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.446501 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.446515 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c84c876b77bea0113eb9322caeb61881a1eaa9ea8864def9c86eb473c754a67"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.530656 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"]
Dec 10 12:27:19 crc kubenswrapper[4852]: E1210 12:27:19.531476 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8130005-2302-4ea1-8677-b590a256d3ec" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.531504 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8130005-2302-4ea1-8677-b590a256d3ec" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.531737 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8130005-2302-4ea1-8677-b590a256d3ec" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.532413 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.534644 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.534724 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.536142 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.538180 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.541201 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"]
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.643812 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.644174 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.644214 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48nd\" (UniqueName: \"kubernetes.io/projected/40acc70f-2b91-4e6e-af47-b525289badc8-kube-api-access-t48nd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.745939 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.746363 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.746457 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t48nd\" (UniqueName: \"kubernetes.io/projected/40acc70f-2b91-4e6e-af47-b525289badc8-kube-api-access-t48nd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.749887 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.750293 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.764502 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48nd\" (UniqueName: \"kubernetes.io/projected/40acc70f-2b91-4e6e-af47-b525289badc8-kube-api-access-t48nd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rh9qn\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:19 crc kubenswrapper[4852]: I1210 12:27:19.847671 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"
Dec 10 12:27:20 crc kubenswrapper[4852]: I1210 12:27:20.455618 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn"]
Dec 10 12:27:20 crc kubenswrapper[4852]: I1210 12:27:20.458418 4852 generic.go:334] "Generic (PLEG): container finished" podID="df462bd0-70ee-4548-9946-edf2f621c14d" containerID="85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad" exitCode=0
Dec 10 12:27:20 crc kubenswrapper[4852]: I1210 12:27:20.458460 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerDied","Data":"85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad"}
Dec 10 12:27:20 crc kubenswrapper[4852]: W1210 12:27:20.461861 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40acc70f_2b91_4e6e_af47_b525289badc8.slice/crio-17c9649118306d47dd542353c78bf3d27c01f26175a51b846b5ab1f57d080ddc WatchSource:0}: Error finding container 17c9649118306d47dd542353c78bf3d27c01f26175a51b846b5ab1f57d080ddc: Status 404 returned error can't find the container with id 17c9649118306d47dd542353c78bf3d27c01f26175a51b846b5ab1f57d080ddc
Dec 10 12:27:21 crc kubenswrapper[4852]: I1210 12:27:21.468002 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"40acc70f-2b91-4e6e-af47-b525289badc8","Type":"ContainerStarted","Data":"17c9649118306d47dd542353c78bf3d27c01f26175a51b846b5ab1f57d080ddc"}
Dec 10 12:27:23 crc kubenswrapper[4852]: I1210 12:27:23.486055 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerStarted","Data":"bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909"}
Dec 10 12:27:23 crc kubenswrapper[4852]: I1210 12:27:23.487468 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"40acc70f-2b91-4e6e-af47-b525289badc8","Type":"ContainerStarted","Data":"9763ecb75e1b9fab2685f05209c3e8b1d0ae67e24dec00be8b9f125c7a80132a"}
Dec 10 12:27:23 crc kubenswrapper[4852]: I1210 12:27:23.508210 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-crjpj" podStartSLOduration=3.026399255 podStartE2EDuration="9.508188397s" podCreationTimestamp="2025-12-10 12:27:14 +0000 UTC" firstStartedPulling="2025-12-10 12:27:16.414221671 +0000 UTC m=+2122.499746895" lastFinishedPulling="2025-12-10 12:27:22.896010803 +0000 UTC m=+2128.981536037" observedRunningTime="2025-12-10 12:27:23.505063209 +0000 UTC m=+2129.590588443" watchObservedRunningTime="2025-12-10 12:27:23.508188397 +0000 UTC m=+2129.593713621"
Dec 10 12:27:23 crc kubenswrapper[4852]: I1210 12:27:23.524568 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" podStartSLOduration=1.815578304 podStartE2EDuration="4.524550657s" podCreationTimestamp="2025-12-10 12:27:19 +0000 UTC" firstStartedPulling="2025-12-10 12:27:20.467197154 +0000 UTC m=+2126.552722378" lastFinishedPulling="2025-12-10 12:27:23.176169487 +0000 UTC m=+2129.261694731" observedRunningTime="2025-12-10 12:27:23.518954377 +0000 UTC m=+2129.604479621" watchObservedRunningTime="2025-12-10 12:27:23.524550657 +0000 UTC m=+2129.610075881"
Dec 10 12:27:24 crc kubenswrapper[4852]: I1210 12:27:24.775198 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:24 crc kubenswrapper[4852]: I1210 12:27:24.775631 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:25 crc kubenswrapper[4852]: I1210 12:27:25.822567 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-crjpj" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="registry-server" probeResult="failure" output=<
Dec 10 12:27:25 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s
Dec 10 12:27:25 crc kubenswrapper[4852]: >
Dec 10 12:27:34 crc kubenswrapper[4852]: I1210 12:27:34.821340 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:34 crc kubenswrapper[4852]: I1210 12:27:34.877110 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:35 crc kubenswrapper[4852]: I1210 12:27:35.059815 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crjpj"]
Dec 10 12:27:36 crc kubenswrapper[4852]: I1210 12:27:36.592182 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-crjpj" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="registry-server" containerID="cri-o://bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909" gracePeriod=2
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.093811 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.269385 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llrk8\" (UniqueName: \"kubernetes.io/projected/df462bd0-70ee-4548-9946-edf2f621c14d-kube-api-access-llrk8\") pod \"df462bd0-70ee-4548-9946-edf2f621c14d\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") "
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.269461 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-utilities\") pod \"df462bd0-70ee-4548-9946-edf2f621c14d\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") "
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.269777 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-catalog-content\") pod \"df462bd0-70ee-4548-9946-edf2f621c14d\" (UID: \"df462bd0-70ee-4548-9946-edf2f621c14d\") "
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.271022 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-utilities" (OuterVolumeSpecName: "utilities") pod "df462bd0-70ee-4548-9946-edf2f621c14d" (UID: "df462bd0-70ee-4548-9946-edf2f621c14d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.276757 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df462bd0-70ee-4548-9946-edf2f621c14d-kube-api-access-llrk8" (OuterVolumeSpecName: "kube-api-access-llrk8") pod "df462bd0-70ee-4548-9946-edf2f621c14d" (UID: "df462bd0-70ee-4548-9946-edf2f621c14d"). InnerVolumeSpecName "kube-api-access-llrk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.372335 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llrk8\" (UniqueName: \"kubernetes.io/projected/df462bd0-70ee-4548-9946-edf2f621c14d-kube-api-access-llrk8\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.372378 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-utilities\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.395917 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df462bd0-70ee-4548-9946-edf2f621c14d" (UID: "df462bd0-70ee-4548-9946-edf2f621c14d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.474709 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df462bd0-70ee-4548-9946-edf2f621c14d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.601987 4852 generic.go:334] "Generic (PLEG): container finished" podID="df462bd0-70ee-4548-9946-edf2f621c14d" containerID="bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909" exitCode=0
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.602026 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerDied","Data":"bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909"}
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.602052 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crjpj" event={"ID":"df462bd0-70ee-4548-9946-edf2f621c14d","Type":"ContainerDied","Data":"fba742e496b494edb95df3a516077c61e05e33271f9d11a450f3b946a3fcb05a"}
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.602049 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crjpj"
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.602068 4852 scope.go:117] "RemoveContainer" containerID="bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909"
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.622023 4852 scope.go:117] "RemoveContainer" containerID="85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad"
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.638752 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crjpj"]
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.647009 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-crjpj"]
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.667183 4852 scope.go:117] "RemoveContainer" containerID="743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e"
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.687899 4852 scope.go:117] "RemoveContainer" containerID="bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909"
Dec 10 12:27:37 crc kubenswrapper[4852]: E1210 12:27:37.688704 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909\": container with ID starting with bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909 not found: ID does not exist" containerID="bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909"
Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.688747 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909"} err="failed to get container status \"bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909\": rpc error: code = NotFound desc = could not find container \"bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909\": container with ID starting with bd9512558637c58b8f789bd1ec20ee4bb0517d61a8574df4f082d013ab0af909 not found: ID does not exist"
Dec 10 12:27:37 crc
kubenswrapper[4852]: I1210 12:27:37.688774 4852 scope.go:117] "RemoveContainer" containerID="85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad" Dec 10 12:27:37 crc kubenswrapper[4852]: E1210 12:27:37.689159 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad\": container with ID starting with 85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad not found: ID does not exist" containerID="85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad" Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.689183 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad"} err="failed to get container status \"85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad\": rpc error: code = NotFound desc = could not find container \"85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad\": container with ID starting with 85162b23cccfd39498b5c495ffc1c2714da678c43566d3823231a947d161f5ad not found: ID does not exist" Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.689199 4852 scope.go:117] "RemoveContainer" containerID="743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e" Dec 10 12:27:37 crc kubenswrapper[4852]: E1210 12:27:37.689667 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e\": container with ID starting with 743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e not found: ID does not exist" containerID="743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e" Dec 10 12:27:37 crc kubenswrapper[4852]: I1210 12:27:37.689705 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e"} err="failed to get container status \"743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e\": rpc error: code = NotFound desc = could not find container \"743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e\": container with ID starting with 743b8395fbf4cee71a6940425a61fdb1a3a47e85acdc48d946fbe5c3dc87800e not found: ID does not exist" Dec 10 12:27:38 crc kubenswrapper[4852]: I1210 12:27:38.185803 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" path="/var/lib/kubelet/pods/df462bd0-70ee-4548-9946-edf2f621c14d/volumes" Dec 10 12:27:45 crc kubenswrapper[4852]: I1210 12:27:45.790648 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:27:45 crc kubenswrapper[4852]: I1210 12:27:45.791313 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.935897 4852 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-f49k8"] Dec 10 12:27:57 crc kubenswrapper[4852]: E1210 12:27:57.937997 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="extract-content" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.938103 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="extract-content" Dec 10 12:27:57 crc kubenswrapper[4852]: E1210 12:27:57.938211 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="registry-server" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.938312 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="registry-server" Dec 10 12:27:57 crc kubenswrapper[4852]: E1210 12:27:57.938391 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="extract-utilities" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.938466 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="extract-utilities" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.938763 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="df462bd0-70ee-4548-9946-edf2f621c14d" containerName="registry-server" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.944764 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:57 crc kubenswrapper[4852]: I1210 12:27:57.950406 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f49k8"] Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.070529 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zb2k\" (UniqueName: \"kubernetes.io/projected/f73da503-ab14-41b7-b296-c63502ef5986-kube-api-access-9zb2k\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.070588 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-utilities\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.070987 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-catalog-content\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.181263 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-catalog-content\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.181708 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zb2k\" (UniqueName: \"kubernetes.io/projected/f73da503-ab14-41b7-b296-c63502ef5986-kube-api-access-9zb2k\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.181765 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-utilities\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.185411 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-utilities\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.185729 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-catalog-content\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.219175 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zb2k\" (UniqueName: \"kubernetes.io/projected/f73da503-ab14-41b7-b296-c63502ef5986-kube-api-access-9zb2k\") pod \"certified-operators-f49k8\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.281897 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:27:58 crc kubenswrapper[4852]: I1210 12:27:58.799422 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f49k8"] Dec 10 12:27:59 crc kubenswrapper[4852]: I1210 12:27:59.811138 4852 generic.go:334] "Generic (PLEG): container finished" podID="f73da503-ab14-41b7-b296-c63502ef5986" containerID="b3851b206fe1f5158f88e046df8934ef71126482f40185b7c1e255715fd0dc0a" exitCode=0 Dec 10 12:27:59 crc kubenswrapper[4852]: I1210 12:27:59.811351 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f49k8" event={"ID":"f73da503-ab14-41b7-b296-c63502ef5986","Type":"ContainerDied","Data":"b3851b206fe1f5158f88e046df8934ef71126482f40185b7c1e255715fd0dc0a"} Dec 10 12:27:59 crc kubenswrapper[4852]: I1210 12:27:59.811763 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f49k8" event={"ID":"f73da503-ab14-41b7-b296-c63502ef5986","Type":"ContainerStarted","Data":"b6a1ef046769b5855158b8604e7f2babcf70b1fe04091f2fd4636f6ce21ae77b"} Dec 10 12:28:00 crc kubenswrapper[4852]: I1210 12:28:00.824510 4852 generic.go:334] "Generic (PLEG): container finished" podID="40acc70f-2b91-4e6e-af47-b525289badc8" containerID="9763ecb75e1b9fab2685f05209c3e8b1d0ae67e24dec00be8b9f125c7a80132a" exitCode=0 Dec 10 12:28:00 crc kubenswrapper[4852]: I1210 12:28:00.824627 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"40acc70f-2b91-4e6e-af47-b525289badc8","Type":"ContainerDied","Data":"9763ecb75e1b9fab2685f05209c3e8b1d0ae67e24dec00be8b9f125c7a80132a"} Dec 10 12:28:00 crc kubenswrapper[4852]: I1210 12:28:00.828898 4852 generic.go:334] "Generic (PLEG): container finished" podID="f73da503-ab14-41b7-b296-c63502ef5986" containerID="6eeb71492af62a227b42d9e4e5adcf0e6edfe92f722afca76444b0afa0f3a3c0" exitCode=0 Dec 10 12:28:00 crc kubenswrapper[4852]: I1210 12:28:00.828946 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f49k8" event={"ID":"f73da503-ab14-41b7-b296-c63502ef5986","Type":"ContainerDied","Data":"6eeb71492af62a227b42d9e4e5adcf0e6edfe92f722afca76444b0afa0f3a3c0"} Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.329731 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.378091 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t48nd\" (UniqueName: \"kubernetes.io/projected/40acc70f-2b91-4e6e-af47-b525289badc8-kube-api-access-t48nd\") pod \"40acc70f-2b91-4e6e-af47-b525289badc8\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.378169 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-ssh-key\") pod \"40acc70f-2b91-4e6e-af47-b525289badc8\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.378248 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-inventory\") pod \"40acc70f-2b91-4e6e-af47-b525289badc8\" (UID: \"40acc70f-2b91-4e6e-af47-b525289badc8\") " Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.383807 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40acc70f-2b91-4e6e-af47-b525289badc8-kube-api-access-t48nd" (OuterVolumeSpecName: "kube-api-access-t48nd") pod "40acc70f-2b91-4e6e-af47-b525289badc8" (UID: "40acc70f-2b91-4e6e-af47-b525289badc8"). InnerVolumeSpecName "kube-api-access-t48nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.451592 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-inventory" (OuterVolumeSpecName: "inventory") pod "40acc70f-2b91-4e6e-af47-b525289badc8" (UID: "40acc70f-2b91-4e6e-af47-b525289badc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.454042 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40acc70f-2b91-4e6e-af47-b525289badc8" (UID: "40acc70f-2b91-4e6e-af47-b525289badc8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.479613 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t48nd\" (UniqueName: \"kubernetes.io/projected/40acc70f-2b91-4e6e-af47-b525289badc8-kube-api-access-t48nd\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.479640 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.479652 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40acc70f-2b91-4e6e-af47-b525289badc8-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.865180 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f49k8" event={"ID":"f73da503-ab14-41b7-b296-c63502ef5986","Type":"ContainerStarted","Data":"814aca210d81bda60f01a31da16a6e4e44533ef4a205cba76dc6fe7e063bc026"} Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.871667 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" event={"ID":"40acc70f-2b91-4e6e-af47-b525289badc8","Type":"ContainerDied","Data":"17c9649118306d47dd542353c78bf3d27c01f26175a51b846b5ab1f57d080ddc"} Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.871711 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17c9649118306d47dd542353c78bf3d27c01f26175a51b846b5ab1f57d080ddc" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.871810 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rh9qn" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.925886 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f49k8" podStartSLOduration=3.989521889 podStartE2EDuration="5.925862475s" podCreationTimestamp="2025-12-10 12:27:57 +0000 UTC" firstStartedPulling="2025-12-10 12:27:59.813123717 +0000 UTC m=+2165.898648941" lastFinishedPulling="2025-12-10 12:28:01.749464303 +0000 UTC m=+2167.834989527" observedRunningTime="2025-12-10 12:28:02.88770117 +0000 UTC m=+2168.973226394" watchObservedRunningTime="2025-12-10 12:28:02.925862475 +0000 UTC m=+2169.011387699" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.937556 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf"] Dec 10 12:28:02 crc kubenswrapper[4852]: E1210 12:28:02.938006 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40acc70f-2b91-4e6e-af47-b525289badc8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.938035 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="40acc70f-2b91-4e6e-af47-b525289badc8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.938350 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="40acc70f-2b91-4e6e-af47-b525289badc8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.939142 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.942934 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.943152 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.943414 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.943572 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:28:02 crc kubenswrapper[4852]: I1210 12:28:02.954556 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf"] Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.089705 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.090015 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.090040 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwpz4\" (UniqueName: \"kubernetes.io/projected/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-kube-api-access-jwpz4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.191496 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.191774 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwpz4\" (UniqueName: \"kubernetes.io/projected/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-kube-api-access-jwpz4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.191937 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.196802 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.198564 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.210627 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwpz4\" (UniqueName: \"kubernetes.io/projected/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-kube-api-access-jwpz4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.262438 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.782501 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf"] Dec 10 12:28:03 crc kubenswrapper[4852]: W1210 12:28:03.786164 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d0bf0c_a43a_47e9_bf7e_5bdad23e513e.slice/crio-7f3fefcda5c1918668954160ab510339233fbbc16bfa290624f16fc922496a22 WatchSource:0}: Error finding container 7f3fefcda5c1918668954160ab510339233fbbc16bfa290624f16fc922496a22: Status 404 returned error can't find the container with id 7f3fefcda5c1918668954160ab510339233fbbc16bfa290624f16fc922496a22 Dec 10 12:28:03 crc kubenswrapper[4852]: I1210 12:28:03.880502 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" event={"ID":"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e","Type":"ContainerStarted","Data":"7f3fefcda5c1918668954160ab510339233fbbc16bfa290624f16fc922496a22"} Dec 10 12:28:05 crc kubenswrapper[4852]: I1210 12:28:05.904367 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" event={"ID":"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e","Type":"ContainerStarted","Data":"58f61431d541918664512b7731c893ac733a69028fc3d427cbda0d45f891375d"} Dec 10 12:28:05 crc kubenswrapper[4852]: I1210 12:28:05.929570 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" podStartSLOduration=3.093152687 podStartE2EDuration="3.929542685s" podCreationTimestamp="2025-12-10 12:28:02 +0000 UTC" firstStartedPulling="2025-12-10 12:28:03.788482058 +0000 UTC m=+2169.874007282" lastFinishedPulling="2025-12-10 12:28:04.624872056 +0000 UTC m=+2170.710397280" observedRunningTime="2025-12-10 12:28:05.919850073 +0000 UTC m=+2172.005375307" watchObservedRunningTime="2025-12-10 12:28:05.929542685 +0000 UTC m=+2172.015067919" Dec 10 12:28:08 crc kubenswrapper[4852]: I1210 12:28:08.282252 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:28:08 crc kubenswrapper[4852]: I1210 12:28:08.284655 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:28:08 crc kubenswrapper[4852]: I1210 12:28:08.329773 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:28:08 crc kubenswrapper[4852]: I1210 12:28:08.980264 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:28:09 crc kubenswrapper[4852]: I1210 12:28:09.034276 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f49k8"] Dec 10 12:28:10 crc kubenswrapper[4852]: I1210 12:28:10.951179 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f49k8" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="registry-server" containerID="cri-o://814aca210d81bda60f01a31da16a6e4e44533ef4a205cba76dc6fe7e063bc026" gracePeriod=2 Dec 10 12:28:11 crc kubenswrapper[4852]: I1210 12:28:11.961280 4852 generic.go:334] 
"Generic (PLEG): container finished" podID="f73da503-ab14-41b7-b296-c63502ef5986" containerID="814aca210d81bda60f01a31da16a6e4e44533ef4a205cba76dc6fe7e063bc026" exitCode=0 Dec 10 12:28:11 crc kubenswrapper[4852]: I1210 12:28:11.961347 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f49k8" event={"ID":"f73da503-ab14-41b7-b296-c63502ef5986","Type":"ContainerDied","Data":"814aca210d81bda60f01a31da16a6e4e44533ef4a205cba76dc6fe7e063bc026"} Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.559474 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.582094 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zb2k\" (UniqueName: \"kubernetes.io/projected/f73da503-ab14-41b7-b296-c63502ef5986-kube-api-access-9zb2k\") pod \"f73da503-ab14-41b7-b296-c63502ef5986\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.582278 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-utilities\") pod \"f73da503-ab14-41b7-b296-c63502ef5986\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.582313 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-catalog-content\") pod \"f73da503-ab14-41b7-b296-c63502ef5986\" (UID: \"f73da503-ab14-41b7-b296-c63502ef5986\") " Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.583402 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-utilities" (OuterVolumeSpecName: "utilities") pod "f73da503-ab14-41b7-b296-c63502ef5986" (UID: "f73da503-ab14-41b7-b296-c63502ef5986"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.588148 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f73da503-ab14-41b7-b296-c63502ef5986-kube-api-access-9zb2k" (OuterVolumeSpecName: "kube-api-access-9zb2k") pod "f73da503-ab14-41b7-b296-c63502ef5986" (UID: "f73da503-ab14-41b7-b296-c63502ef5986"). InnerVolumeSpecName "kube-api-access-9zb2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.644001 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f73da503-ab14-41b7-b296-c63502ef5986" (UID: "f73da503-ab14-41b7-b296-c63502ef5986"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.684658 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zb2k\" (UniqueName: \"kubernetes.io/projected/f73da503-ab14-41b7-b296-c63502ef5986-kube-api-access-9zb2k\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.684690 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.684702 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f73da503-ab14-41b7-b296-c63502ef5986-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.973343 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f49k8" event={"ID":"f73da503-ab14-41b7-b296-c63502ef5986","Type":"ContainerDied","Data":"b6a1ef046769b5855158b8604e7f2babcf70b1fe04091f2fd4636f6ce21ae77b"} Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.973397 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f49k8" Dec 10 12:28:12 crc kubenswrapper[4852]: I1210 12:28:12.973409 4852 scope.go:117] "RemoveContainer" containerID="814aca210d81bda60f01a31da16a6e4e44533ef4a205cba76dc6fe7e063bc026" Dec 10 12:28:13 crc kubenswrapper[4852]: I1210 12:28:13.008862 4852 scope.go:117] "RemoveContainer" containerID="6eeb71492af62a227b42d9e4e5adcf0e6edfe92f722afca76444b0afa0f3a3c0" Dec 10 12:28:13 crc kubenswrapper[4852]: I1210 12:28:13.009012 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f49k8"] Dec 10 12:28:13 crc kubenswrapper[4852]: I1210 12:28:13.016864 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f49k8"] Dec 10 12:28:13 crc kubenswrapper[4852]: I1210 12:28:13.032492 4852 scope.go:117] "RemoveContainer" containerID="b3851b206fe1f5158f88e046df8934ef71126482f40185b7c1e255715fd0dc0a" Dec 10 12:28:14 crc kubenswrapper[4852]: I1210 12:28:14.181902 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f73da503-ab14-41b7-b296-c63502ef5986" path="/var/lib/kubelet/pods/f73da503-ab14-41b7-b296-c63502ef5986/volumes" Dec 10 12:28:15 crc kubenswrapper[4852]: I1210 12:28:15.790055 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:28:15 crc kubenswrapper[4852]: I1210 12:28:15.790421 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:28:45 crc kubenswrapper[4852]: I1210 12:28:45.790595 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:28:45 crc kubenswrapper[4852]: I1210 12:28:45.791170 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:28:45 crc kubenswrapper[4852]: I1210 12:28:45.791258 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:28:45 crc kubenswrapper[4852]: I1210 12:28:45.792210 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6352925c4a9867f9532e5b81e9bae6d86319db1a6b4361b34333b9d6c861d17"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:28:45 crc kubenswrapper[4852]: I1210 12:28:45.792347 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://f6352925c4a9867f9532e5b81e9bae6d86319db1a6b4361b34333b9d6c861d17" gracePeriod=600 Dec 10 12:28:46 crc kubenswrapper[4852]: I1210 12:28:46.551578 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="f6352925c4a9867f9532e5b81e9bae6d86319db1a6b4361b34333b9d6c861d17" exitCode=0 Dec 10 12:28:46 crc kubenswrapper[4852]: I1210 12:28:46.551646 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"f6352925c4a9867f9532e5b81e9bae6d86319db1a6b4361b34333b9d6c861d17"} Dec 10 12:28:46 crc kubenswrapper[4852]: I1210 12:28:46.552032 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd"} Dec 10 12:28:46 crc kubenswrapper[4852]: I1210 12:28:46.552059 4852 scope.go:117] "RemoveContainer" containerID="3956bea04da93b7e07d82d95a5b7b0b427cc93a57bdfb6c9d49ad75bc030e559" Dec 10 12:28:54 crc kubenswrapper[4852]: I1210 12:28:54.627362 4852 generic.go:334] "Generic (PLEG): container finished" podID="95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" containerID="58f61431d541918664512b7731c893ac733a69028fc3d427cbda0d45f891375d" exitCode=0 Dec 10 12:28:54 crc kubenswrapper[4852]: I1210 12:28:54.627430 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" event={"ID":"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e","Type":"ContainerDied","Data":"58f61431d541918664512b7731c893ac733a69028fc3d427cbda0d45f891375d"} Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.057779 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.139030 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-inventory\") pod \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.139109 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwpz4\" (UniqueName: \"kubernetes.io/projected/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-kube-api-access-jwpz4\") pod \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.139193 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-ssh-key\") pod \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\" (UID: \"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e\") " Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.153210 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-kube-api-access-jwpz4" (OuterVolumeSpecName: "kube-api-access-jwpz4") pod "95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" (UID: "95d0bf0c-a43a-47e9-bf7e-5bdad23e513e"). InnerVolumeSpecName "kube-api-access-jwpz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.168849 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" (UID: "95d0bf0c-a43a-47e9-bf7e-5bdad23e513e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.178211 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-inventory" (OuterVolumeSpecName: "inventory") pod "95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" (UID: "95d0bf0c-a43a-47e9-bf7e-5bdad23e513e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.241629 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.241666 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwpz4\" (UniqueName: \"kubernetes.io/projected/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-kube-api-access-jwpz4\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.241677 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95d0bf0c-a43a-47e9-bf7e-5bdad23e513e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.652720 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" event={"ID":"95d0bf0c-a43a-47e9-bf7e-5bdad23e513e","Type":"ContainerDied","Data":"7f3fefcda5c1918668954160ab510339233fbbc16bfa290624f16fc922496a22"} Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.653092 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f3fefcda5c1918668954160ab510339233fbbc16bfa290624f16fc922496a22" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.652773 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.722898 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xxr4g"] Dec 10 12:28:56 crc kubenswrapper[4852]: E1210 12:28:56.723318 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.723337 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 10 12:28:56 crc kubenswrapper[4852]: E1210 12:28:56.723365 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="extract-utilities" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.723372 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="extract-utilities" Dec 10 12:28:56 crc kubenswrapper[4852]: E1210 12:28:56.723418 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="extract-content" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.723424 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="extract-content" Dec 10 12:28:56 crc kubenswrapper[4852]: E1210 12:28:56.723433 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="registry-server" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.723438 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="registry-server" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.723687 4852 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f73da503-ab14-41b7-b296-c63502ef5986" containerName="registry-server" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.723706 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d0bf0c-a43a-47e9-bf7e-5bdad23e513e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.724325 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.727279 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.727854 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.727856 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.728562 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.732558 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xxr4g"] Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.851843 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.851928 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.851955 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfzp9\" (UniqueName: \"kubernetes.io/projected/bc600c67-710c-494a-9fb0-866745c0709d-kube-api-access-tfzp9\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.954009 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.954120 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" Dec 
10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.954146 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfzp9\" (UniqueName: \"kubernetes.io/projected/bc600c67-710c-494a-9fb0-866745c0709d-kube-api-access-tfzp9\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.959398 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.959559 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:28:56 crc kubenswrapper[4852]: I1210 12:28:56.970992 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfzp9\" (UniqueName: \"kubernetes.io/projected/bc600c67-710c-494a-9fb0-866745c0709d-kube-api-access-tfzp9\") pod \"ssh-known-hosts-edpm-deployment-xxr4g\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:28:57 crc kubenswrapper[4852]: I1210 12:28:57.040884 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:28:57 crc kubenswrapper[4852]: I1210 12:28:57.541454 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xxr4g"]
Dec 10 12:28:57 crc kubenswrapper[4852]: I1210 12:28:57.662591 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" event={"ID":"bc600c67-710c-494a-9fb0-866745c0709d","Type":"ContainerStarted","Data":"6f6c7572da039962ab14eab5055a9ad826b0ce8ffa225c4aedd9a782e8e1195a"}
Dec 10 12:29:00 crc kubenswrapper[4852]: I1210 12:29:00.690054 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" event={"ID":"bc600c67-710c-494a-9fb0-866745c0709d","Type":"ContainerStarted","Data":"3afccf6d570c203fe5062ed1c032f7f6ecb8594ab3464c6d9e2c5ff4c0d87957"}
Dec 10 12:29:00 crc kubenswrapper[4852]: I1210 12:29:00.704736 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" podStartSLOduration=2.897115779 podStartE2EDuration="4.704719347s" podCreationTimestamp="2025-12-10 12:28:56 +0000 UTC" firstStartedPulling="2025-12-10 12:28:57.546043659 +0000 UTC m=+2223.631568883" lastFinishedPulling="2025-12-10 12:28:59.353647227 +0000 UTC m=+2225.439172451" observedRunningTime="2025-12-10 12:29:00.704659666 +0000 UTC m=+2226.790184890" watchObservedRunningTime="2025-12-10 12:29:00.704719347 +0000 UTC m=+2226.790244581"
Dec 10 12:29:06 crc kubenswrapper[4852]: I1210 12:29:06.753693 4852 generic.go:334] "Generic (PLEG): container finished" podID="bc600c67-710c-494a-9fb0-866745c0709d" containerID="3afccf6d570c203fe5062ed1c032f7f6ecb8594ab3464c6d9e2c5ff4c0d87957" exitCode=0
Dec 10 12:29:06 crc kubenswrapper[4852]: I1210 12:29:06.753802 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" event={"ID":"bc600c67-710c-494a-9fb0-866745c0709d","Type":"ContainerDied","Data":"3afccf6d570c203fe5062ed1c032f7f6ecb8594ab3464c6d9e2c5ff4c0d87957"}
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.183139 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.264377 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfzp9\" (UniqueName: \"kubernetes.io/projected/bc600c67-710c-494a-9fb0-866745c0709d-kube-api-access-tfzp9\") pod \"bc600c67-710c-494a-9fb0-866745c0709d\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") "
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.264436 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-ssh-key-openstack-edpm-ipam\") pod \"bc600c67-710c-494a-9fb0-866745c0709d\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") "
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.264517 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-inventory-0\") pod \"bc600c67-710c-494a-9fb0-866745c0709d\" (UID: \"bc600c67-710c-494a-9fb0-866745c0709d\") "
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.270670 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc600c67-710c-494a-9fb0-866745c0709d-kube-api-access-tfzp9" (OuterVolumeSpecName: "kube-api-access-tfzp9") pod "bc600c67-710c-494a-9fb0-866745c0709d" (UID: "bc600c67-710c-494a-9fb0-866745c0709d"). InnerVolumeSpecName "kube-api-access-tfzp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.292037 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bc600c67-710c-494a-9fb0-866745c0709d" (UID: "bc600c67-710c-494a-9fb0-866745c0709d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.292169 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bc600c67-710c-494a-9fb0-866745c0709d" (UID: "bc600c67-710c-494a-9fb0-866745c0709d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.366456 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfzp9\" (UniqueName: \"kubernetes.io/projected/bc600c67-710c-494a-9fb0-866745c0709d-kube-api-access-tfzp9\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.366484 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.366496 4852 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bc600c67-710c-494a-9fb0-866745c0709d-inventory-0\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.771046 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g" event={"ID":"bc600c67-710c-494a-9fb0-866745c0709d","Type":"ContainerDied","Data":"6f6c7572da039962ab14eab5055a9ad826b0ce8ffa225c4aedd9a782e8e1195a"}
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.771087 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f6c7572da039962ab14eab5055a9ad826b0ce8ffa225c4aedd9a782e8e1195a"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.771163 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxr4g"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.842479 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"]
Dec 10 12:29:08 crc kubenswrapper[4852]: E1210 12:29:08.842949 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc600c67-710c-494a-9fb0-866745c0709d" containerName="ssh-known-hosts-edpm-deployment"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.842972 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc600c67-710c-494a-9fb0-866745c0709d" containerName="ssh-known-hosts-edpm-deployment"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.843190 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc600c67-710c-494a-9fb0-866745c0709d" containerName="ssh-known-hosts-edpm-deployment"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.843944 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.845910 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.846080 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.846097 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.846153 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.854689 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"]
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.977017 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.977075 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:08 crc kubenswrapper[4852]: I1210 12:29:08.977205 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdxd\" (UniqueName: \"kubernetes.io/projected/5d7d1222-768a-4615-8aaa-385740584e4e-kube-api-access-njdxd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.079337 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdxd\" (UniqueName: \"kubernetes.io/projected/5d7d1222-768a-4615-8aaa-385740584e4e-kube-api-access-njdxd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.079455 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.079486 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.083996 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.085099 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.105317 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njdxd\" (UniqueName: \"kubernetes.io/projected/5d7d1222-768a-4615-8aaa-385740584e4e-kube-api-access-njdxd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nlmsv\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.166546 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.648143 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"]
Dec 10 12:29:09 crc kubenswrapper[4852]: I1210 12:29:09.781674 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv" event={"ID":"5d7d1222-768a-4615-8aaa-385740584e4e","Type":"ContainerStarted","Data":"d303f787c9a11a3686b8dafffe816b52fa1e6c4c0c00f15e479e1d12eb3b87d2"}
Dec 10 12:29:10 crc kubenswrapper[4852]: I1210 12:29:10.793043 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv" event={"ID":"5d7d1222-768a-4615-8aaa-385740584e4e","Type":"ContainerStarted","Data":"f0be8ac4d6e505631da0dcaf8fff1e55dc123ae9538425d8f3c09b1a38645e96"}
Dec 10 12:29:10 crc kubenswrapper[4852]: I1210 12:29:10.812005 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv" podStartSLOduration=2.14402786 podStartE2EDuration="2.811984345s" podCreationTimestamp="2025-12-10 12:29:08 +0000 UTC" firstStartedPulling="2025-12-10 12:29:09.654892257 +0000 UTC m=+2235.740417491" lastFinishedPulling="2025-12-10 12:29:10.322848742 +0000 UTC m=+2236.408373976" observedRunningTime="2025-12-10 12:29:10.811878273 +0000 UTC m=+2236.897403517" watchObservedRunningTime="2025-12-10 12:29:10.811984345 +0000 UTC m=+2236.897509569"
Dec 10 12:29:18 crc kubenswrapper[4852]: I1210 12:29:18.864038 4852 generic.go:334] "Generic (PLEG): container finished" podID="5d7d1222-768a-4615-8aaa-385740584e4e" containerID="f0be8ac4d6e505631da0dcaf8fff1e55dc123ae9538425d8f3c09b1a38645e96" exitCode=0
Dec 10 12:29:18 crc kubenswrapper[4852]: I1210 12:29:18.864139 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv" event={"ID":"5d7d1222-768a-4615-8aaa-385740584e4e","Type":"ContainerDied","Data":"f0be8ac4d6e505631da0dcaf8fff1e55dc123ae9538425d8f3c09b1a38645e96"}
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.268668 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.387424 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-ssh-key\") pod \"5d7d1222-768a-4615-8aaa-385740584e4e\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") "
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.387817 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-inventory\") pod \"5d7d1222-768a-4615-8aaa-385740584e4e\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") "
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.387953 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njdxd\" (UniqueName: \"kubernetes.io/projected/5d7d1222-768a-4615-8aaa-385740584e4e-kube-api-access-njdxd\") pod \"5d7d1222-768a-4615-8aaa-385740584e4e\" (UID: \"5d7d1222-768a-4615-8aaa-385740584e4e\") "
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.394977 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7d1222-768a-4615-8aaa-385740584e4e-kube-api-access-njdxd" (OuterVolumeSpecName: "kube-api-access-njdxd") pod "5d7d1222-768a-4615-8aaa-385740584e4e" (UID: "5d7d1222-768a-4615-8aaa-385740584e4e"). InnerVolumeSpecName "kube-api-access-njdxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.413941 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d7d1222-768a-4615-8aaa-385740584e4e" (UID: "5d7d1222-768a-4615-8aaa-385740584e4e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.416541 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-inventory" (OuterVolumeSpecName: "inventory") pod "5d7d1222-768a-4615-8aaa-385740584e4e" (UID: "5d7d1222-768a-4615-8aaa-385740584e4e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.490401 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.490452 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d7d1222-768a-4615-8aaa-385740584e4e-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.490471 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njdxd\" (UniqueName: \"kubernetes.io/projected/5d7d1222-768a-4615-8aaa-385740584e4e-kube-api-access-njdxd\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.891127 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv" event={"ID":"5d7d1222-768a-4615-8aaa-385740584e4e","Type":"ContainerDied","Data":"d303f787c9a11a3686b8dafffe816b52fa1e6c4c0c00f15e479e1d12eb3b87d2"}
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.891173 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d303f787c9a11a3686b8dafffe816b52fa1e6c4c0c00f15e479e1d12eb3b87d2"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.891274 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nlmsv"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.966345 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"]
Dec 10 12:29:20 crc kubenswrapper[4852]: E1210 12:29:20.966879 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7d1222-768a-4615-8aaa-385740584e4e" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.966905 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7d1222-768a-4615-8aaa-385740584e4e" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.967162 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7d1222-768a-4615-8aaa-385740584e4e" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.968066 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.970159 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.970159 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.970537 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.970709 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.974592 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"]
Dec 10 12:29:20 crc kubenswrapper[4852]: I1210 12:29:20.999690 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9wpz\" (UniqueName: \"kubernetes.io/projected/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-kube-api-access-s9wpz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.000045 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.000113 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.101320 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.101398 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.101484 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9wpz\" (UniqueName: \"kubernetes.io/projected/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-kube-api-access-s9wpz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.106368 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.106949 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.119789 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9wpz\" (UniqueName: \"kubernetes.io/projected/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-kube-api-access-s9wpz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.304441 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.838470 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"]
Dec 10 12:29:21 crc kubenswrapper[4852]: W1210 12:29:21.841011 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b026ff_2591_4ac3_9ce3_51b0ab9b20d2.slice/crio-4eeeed8a1ebb7170bd48f1e564d8f48e1517d5bf832244ab2bf427b036caa8a0 WatchSource:0}: Error finding container 4eeeed8a1ebb7170bd48f1e564d8f48e1517d5bf832244ab2bf427b036caa8a0: Status 404 returned error can't find the container with id 4eeeed8a1ebb7170bd48f1e564d8f48e1517d5bf832244ab2bf427b036caa8a0
Dec 10 12:29:21 crc kubenswrapper[4852]: I1210 12:29:21.899996 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r" event={"ID":"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2","Type":"ContainerStarted","Data":"4eeeed8a1ebb7170bd48f1e564d8f48e1517d5bf832244ab2bf427b036caa8a0"}
Dec 10 12:29:22 crc kubenswrapper[4852]: I1210 12:29:22.910371 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r" event={"ID":"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2","Type":"ContainerStarted","Data":"ffbf6ae539e6cc638570b299ac100218bf1a4572eabc31506ba21fbea81190dd"}
Dec 10 12:29:22 crc kubenswrapper[4852]: I1210 12:29:22.926746 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r" podStartSLOduration=2.520103139 podStartE2EDuration="2.926722309s" podCreationTimestamp="2025-12-10 12:29:20 +0000 UTC" firstStartedPulling="2025-12-10 12:29:21.843517109 +0000 UTC m=+2247.929042333" lastFinishedPulling="2025-12-10 12:29:22.250136279 +0000 UTC m=+2248.335661503" observedRunningTime="2025-12-10 12:29:22.925768865 +0000 UTC m=+2249.011294109" watchObservedRunningTime="2025-12-10 12:29:22.926722309 +0000 UTC m=+2249.012247533"
Dec 10 12:29:31 crc kubenswrapper[4852]: I1210 12:29:31.996128 4852 generic.go:334] "Generic (PLEG): container finished" podID="f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" containerID="ffbf6ae539e6cc638570b299ac100218bf1a4572eabc31506ba21fbea81190dd" exitCode=0
Dec 10 12:29:31 crc kubenswrapper[4852]: I1210 12:29:31.996185 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r" event={"ID":"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2","Type":"ContainerDied","Data":"ffbf6ae539e6cc638570b299ac100218bf1a4572eabc31506ba21fbea81190dd"}
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.419443 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.537292 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-ssh-key\") pod \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") "
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.537397 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-inventory\") pod \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") "
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.537578 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9wpz\" (UniqueName: \"kubernetes.io/projected/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-kube-api-access-s9wpz\") pod \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\" (UID: \"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2\") "
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.543797 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-kube-api-access-s9wpz" (OuterVolumeSpecName: "kube-api-access-s9wpz") pod "f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" (UID: "f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2"). InnerVolumeSpecName "kube-api-access-s9wpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.571918 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" (UID: "f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.590067 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-inventory" (OuterVolumeSpecName: "inventory") pod "f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" (UID: "f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.639513 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.639545 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-inventory\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:33 crc kubenswrapper[4852]: I1210 12:29:33.639554 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9wpz\" (UniqueName: \"kubernetes.io/projected/f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2-kube-api-access-s9wpz\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.028008 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r" event={"ID":"f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2","Type":"ContainerDied","Data":"4eeeed8a1ebb7170bd48f1e564d8f48e1517d5bf832244ab2bf427b036caa8a0"}
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.028053 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eeeed8a1ebb7170bd48f1e564d8f48e1517d5bf832244ab2bf427b036caa8a0"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.028087 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.112058 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"]
Dec 10 12:29:34 crc kubenswrapper[4852]: E1210 12:29:34.112741 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.112771 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.113067 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.114077 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.126088 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.126610 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.126709 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.131432 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.131647 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.131651 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.131908 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.131944 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.139031 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"]
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256334 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256392 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256418 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256437 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256463 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndsw\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-kube-api-access-6ndsw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256506 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256602 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256639 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256670 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256688 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256705 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256726 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256750 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.256775 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.358391 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.358477 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.359481 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.359637 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.359897 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.360123 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.360279 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.360442 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.360652 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.361435 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.361586 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.361772 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.361915 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndsw\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-kube-api-access-6ndsw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.361972 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.363675 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.364061 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.364619 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.364891 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.365433 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.365780 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.365878 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.366528 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.367416 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.367734 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.368331 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.368890 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.369061 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.382176 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndsw\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-kube-api-access-6ndsw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nshgs\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:34 crc kubenswrapper[4852]: I1210 12:29:34.479663 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"
Dec 10 12:29:35 crc kubenswrapper[4852]: I1210 12:29:35.009957 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs"]
Dec 10 12:29:35 crc kubenswrapper[4852]: I1210 12:29:35.036769 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" event={"ID":"ca5421d7-d674-4ead-b580-d8c63cdffb0c","Type":"ContainerStarted","Data":"cb98c15edb232545bb1a8fc75417d5a7b0592b30203c29b850c6bc47c6e60f18"}
Dec 10 12:29:36 crc kubenswrapper[4852]: I1210 12:29:36.045912 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" event={"ID":"ca5421d7-d674-4ead-b580-d8c63cdffb0c","Type":"ContainerStarted","Data":"e37a14b041b5d161e96dc500bcc21ee0c8dd61e3722ad510c74bdafbbdef0c27"}
Dec 10 12:29:36 crc kubenswrapper[4852]: I1210 12:29:36.068455 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" podStartSLOduration=1.444429172 podStartE2EDuration="2.068437768s" podCreationTimestamp="2025-12-10 12:29:34 +0000 UTC" firstStartedPulling="2025-12-10 12:29:35.01451223 +0000 UTC m=+2261.100037454" lastFinishedPulling="2025-12-10 12:29:35.638520826 +0000 UTC m=+2261.724046050" observedRunningTime="2025-12-10 12:29:36.061470714 +0000 UTC m=+2262.146995958" watchObservedRunningTime="2025-12-10 12:29:36.068437768 +0000 UTC m=+2262.153962992"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.567542 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25nff"]
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.571030 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.584320 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25nff"]
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.752117 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj294\" (UniqueName: \"kubernetes.io/projected/09baf756-c544-41c1-828e-ae539d9c0bea-kube-api-access-qj294\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.752411 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-utilities\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.752626 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-catalog-content\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.854336 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-catalog-content\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.854803 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj294\" (UniqueName: \"kubernetes.io/projected/09baf756-c544-41c1-828e-ae539d9c0bea-kube-api-access-qj294\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.854845 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-utilities\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.854928 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-catalog-content\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.855301 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-utilities\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.877755 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj294\" (UniqueName: \"kubernetes.io/projected/09baf756-c544-41c1-828e-ae539d9c0bea-kube-api-access-qj294\") pod \"community-operators-25nff\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") " pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:38 crc kubenswrapper[4852]: I1210 12:29:38.888389 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:39 crc kubenswrapper[4852]: W1210 12:29:39.437629 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09baf756_c544_41c1_828e_ae539d9c0bea.slice/crio-2c296f657c856184e22924f47162796249174c996f261163cf516780e33d88e9 WatchSource:0}: Error finding container 2c296f657c856184e22924f47162796249174c996f261163cf516780e33d88e9: Status 404 returned error can't find the container with id 2c296f657c856184e22924f47162796249174c996f261163cf516780e33d88e9
Dec 10 12:29:39 crc kubenswrapper[4852]: I1210 12:29:39.439115 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25nff"]
Dec 10 12:29:40 crc kubenswrapper[4852]: I1210 12:29:40.097251 4852 generic.go:334] "Generic (PLEG): container finished" podID="09baf756-c544-41c1-828e-ae539d9c0bea" containerID="2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12" exitCode=0
Dec 10 12:29:40 crc kubenswrapper[4852]: I1210 12:29:40.097319 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25nff" event={"ID":"09baf756-c544-41c1-828e-ae539d9c0bea","Type":"ContainerDied","Data":"2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12"}
Dec 10 12:29:40 crc kubenswrapper[4852]: I1210 12:29:40.097506 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25nff" event={"ID":"09baf756-c544-41c1-828e-ae539d9c0bea","Type":"ContainerStarted","Data":"2c296f657c856184e22924f47162796249174c996f261163cf516780e33d88e9"}
Dec 10 12:29:42 crc kubenswrapper[4852]: I1210 12:29:42.118715 4852 generic.go:334] "Generic (PLEG): container finished" podID="09baf756-c544-41c1-828e-ae539d9c0bea" containerID="683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7" exitCode=0
Dec 10 12:29:42 crc kubenswrapper[4852]: I1210 12:29:42.118829 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25nff" event={"ID":"09baf756-c544-41c1-828e-ae539d9c0bea","Type":"ContainerDied","Data":"683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7"}
Dec 10 12:29:43 crc kubenswrapper[4852]: I1210 12:29:43.130830 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25nff" event={"ID":"09baf756-c544-41c1-828e-ae539d9c0bea","Type":"ContainerStarted","Data":"bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7"}
Dec 10 12:29:43 crc kubenswrapper[4852]: I1210 12:29:43.161579 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25nff" podStartSLOduration=2.637351605 podStartE2EDuration="5.161561587s" podCreationTimestamp="2025-12-10 12:29:38 +0000 UTC" firstStartedPulling="2025-12-10 12:29:40.098909351 +0000 UTC m=+2266.184434575" lastFinishedPulling="2025-12-10 12:29:42.623119333 +0000 UTC m=+2268.708644557" observedRunningTime="2025-12-10 12:29:43.153684368 +0000 UTC m=+2269.239209602" watchObservedRunningTime="2025-12-10 12:29:43.161561587 +0000 UTC m=+2269.247086821"
Dec 10 12:29:48 crc kubenswrapper[4852]: I1210 12:29:48.889067 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:48 crc kubenswrapper[4852]: I1210 12:29:48.889672 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:48 crc kubenswrapper[4852]: I1210 12:29:48.944268 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:49 crc kubenswrapper[4852]: I1210 12:29:49.233944 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:49 crc kubenswrapper[4852]: I1210 12:29:49.281954 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25nff"]
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.210486 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25nff" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="registry-server" containerID="cri-o://bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7" gracePeriod=2
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.703516 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25nff"
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.785176 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-utilities\") pod \"09baf756-c544-41c1-828e-ae539d9c0bea\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") "
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.785500 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-catalog-content\") pod \"09baf756-c544-41c1-828e-ae539d9c0bea\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") "
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.785561 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj294\" (UniqueName: \"kubernetes.io/projected/09baf756-c544-41c1-828e-ae539d9c0bea-kube-api-access-qj294\") pod \"09baf756-c544-41c1-828e-ae539d9c0bea\" (UID: \"09baf756-c544-41c1-828e-ae539d9c0bea\") "
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.786324 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-utilities" (OuterVolumeSpecName: "utilities") pod "09baf756-c544-41c1-828e-ae539d9c0bea" (UID: "09baf756-c544-41c1-828e-ae539d9c0bea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.791639 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09baf756-c544-41c1-828e-ae539d9c0bea-kube-api-access-qj294" (OuterVolumeSpecName: "kube-api-access-qj294") pod "09baf756-c544-41c1-828e-ae539d9c0bea" (UID: "09baf756-c544-41c1-828e-ae539d9c0bea"). InnerVolumeSpecName "kube-api-access-qj294". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.851825 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09baf756-c544-41c1-828e-ae539d9c0bea" (UID: "09baf756-c544-41c1-828e-ae539d9c0bea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.887351 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.887380 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj294\" (UniqueName: \"kubernetes.io/projected/09baf756-c544-41c1-828e-ae539d9c0bea-kube-api-access-qj294\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:51 crc kubenswrapper[4852]: I1210 12:29:51.887391 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09baf756-c544-41c1-828e-ae539d9c0bea-utilities\") on node \"crc\" DevicePath \"\""
Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.223139 4852 generic.go:334] "Generic (PLEG): container finished" podID="09baf756-c544-41c1-828e-ae539d9c0bea" containerID="bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7" exitCode=0
Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.223201 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25nff" event={"ID":"09baf756-c544-41c1-828e-ae539d9c0bea","Type":"ContainerDied","Data":"bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7"}
Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.223489 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25nff" event={"ID":"09baf756-c544-41c1-828e-ae539d9c0bea","Type":"ContainerDied","Data":"2c296f657c856184e22924f47162796249174c996f261163cf516780e33d88e9"}
Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.223516 4852 scope.go:117] "RemoveContainer" containerID="bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7"
Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.223253 4852 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-25nff" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.250965 4852 scope.go:117] "RemoveContainer" containerID="683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.255964 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25nff"] Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.265467 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25nff"] Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.293769 4852 scope.go:117] "RemoveContainer" containerID="2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.353267 4852 scope.go:117] "RemoveContainer" containerID="bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7" Dec 10 12:29:52 crc kubenswrapper[4852]: E1210 12:29:52.353725 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7\": container with ID starting with bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7 not found: ID does not exist" containerID="bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.353768 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7"} err="failed to get container status \"bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7\": rpc error: code = NotFound desc = could not find container \"bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7\": container with ID starting with bd88a469d824cdabdfd37bf8a82b15612eb850a29d96e163befa0d57aa8e25f7 not found: ID does not exist" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.353794 4852 scope.go:117] "RemoveContainer" containerID="683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7" Dec 10 12:29:52 crc kubenswrapper[4852]: E1210 12:29:52.354158 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7\": container with ID starting with 683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7 not found: ID does not exist" containerID="683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.354215 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7"} err="failed to get container status \"683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7\": rpc error: code = NotFound desc = could not find container \"683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7\": container with ID starting with 683a47b464e5d00d6f47d83cce2ceeee387d44c0905de9c68c3e4c86d13b11a7 not found: ID does not exist" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.354293 4852 scope.go:117] "RemoveContainer" containerID="2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12" Dec 10 12:29:52 crc kubenswrapper[4852]: E1210 12:29:52.354636 4852 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12\": container with ID starting with 2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12 not found: ID does not exist" containerID="2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12" Dec 10 12:29:52 crc kubenswrapper[4852]: I1210 12:29:52.354670 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12"} err="failed to get container status \"2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12\": rpc error: code = NotFound desc = could not find container \"2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12\": container with ID starting with 2f123510d770b7df1e7f82369d72ad1ff3bf28b139c5087a9b8f63dac1206b12 not found: ID does not exist" Dec 10 12:29:54 crc kubenswrapper[4852]: I1210 12:29:54.185034 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" path="/var/lib/kubelet/pods/09baf756-c544-41c1-828e-ae539d9c0bea/volumes" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.142342 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs"] Dec 10 12:30:00 crc kubenswrapper[4852]: E1210 12:30:00.143242 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="extract-utilities" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.143256 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="extract-utilities" Dec 10 12:30:00 crc kubenswrapper[4852]: E1210 12:30:00.143278 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="extract-content" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.143284 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="extract-content" Dec 10 12:30:00 crc kubenswrapper[4852]: E1210 12:30:00.143302 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="registry-server" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.143309 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="registry-server" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.143471 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="09baf756-c544-41c1-828e-ae539d9c0bea" containerName="registry-server" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.144081 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.149762 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.149888 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.153209 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs"] Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.266215 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de404464-cc0c-4fdf-b7dc-aef4812710fe-secret-volume\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.266387 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de404464-cc0c-4fdf-b7dc-aef4812710fe-config-volume\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.266499 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5mb\" (UniqueName: \"kubernetes.io/projected/de404464-cc0c-4fdf-b7dc-aef4812710fe-kube-api-access-mr5mb\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.368725 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5mb\" (UniqueName: \"kubernetes.io/projected/de404464-cc0c-4fdf-b7dc-aef4812710fe-kube-api-access-mr5mb\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.368881 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de404464-cc0c-4fdf-b7dc-aef4812710fe-secret-volume\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.369012 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de404464-cc0c-4fdf-b7dc-aef4812710fe-config-volume\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.370339 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de404464-cc0c-4fdf-b7dc-aef4812710fe-config-volume\") pod 
\"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.377015 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de404464-cc0c-4fdf-b7dc-aef4812710fe-secret-volume\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.389184 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5mb\" (UniqueName: \"kubernetes.io/projected/de404464-cc0c-4fdf-b7dc-aef4812710fe-kube-api-access-mr5mb\") pod \"collect-profiles-29422830-hc7vs\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.472935 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:00 crc kubenswrapper[4852]: I1210 12:30:00.942933 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs"] Dec 10 12:30:01 crc kubenswrapper[4852]: I1210 12:30:01.309760 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" event={"ID":"de404464-cc0c-4fdf-b7dc-aef4812710fe","Type":"ContainerStarted","Data":"3c9d3760c88336ce00d25dcd167735f629c3fc9d0aec8a3481a732ed7524edd6"} Dec 10 12:30:01 crc kubenswrapper[4852]: I1210 12:30:01.310111 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" event={"ID":"de404464-cc0c-4fdf-b7dc-aef4812710fe","Type":"ContainerStarted","Data":"ce5569cac995f2af5aebeb21b1acaa4c63c12c535972e11bd1cc4954393809c8"} Dec 10 12:30:01 crc kubenswrapper[4852]: I1210 12:30:01.326924 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" podStartSLOduration=1.326908227 podStartE2EDuration="1.326908227s" podCreationTimestamp="2025-12-10 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:30:01.324973858 +0000 UTC m=+2287.410499102" watchObservedRunningTime="2025-12-10 12:30:01.326908227 +0000 UTC m=+2287.412433441" Dec 10 12:30:02 crc kubenswrapper[4852]: I1210 12:30:02.326614 4852 generic.go:334] "Generic (PLEG): container finished" podID="de404464-cc0c-4fdf-b7dc-aef4812710fe" containerID="3c9d3760c88336ce00d25dcd167735f629c3fc9d0aec8a3481a732ed7524edd6" exitCode=0 Dec 10 12:30:02 crc kubenswrapper[4852]: I1210 12:30:02.326723 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" event={"ID":"de404464-cc0c-4fdf-b7dc-aef4812710fe","Type":"ContainerDied","Data":"3c9d3760c88336ce00d25dcd167735f629c3fc9d0aec8a3481a732ed7524edd6"} Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.664506 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.836222 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de404464-cc0c-4fdf-b7dc-aef4812710fe-secret-volume\") pod \"de404464-cc0c-4fdf-b7dc-aef4812710fe\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.836741 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr5mb\" (UniqueName: \"kubernetes.io/projected/de404464-cc0c-4fdf-b7dc-aef4812710fe-kube-api-access-mr5mb\") pod \"de404464-cc0c-4fdf-b7dc-aef4812710fe\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.836773 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de404464-cc0c-4fdf-b7dc-aef4812710fe-config-volume\") pod \"de404464-cc0c-4fdf-b7dc-aef4812710fe\" (UID: \"de404464-cc0c-4fdf-b7dc-aef4812710fe\") " Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.837454 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de404464-cc0c-4fdf-b7dc-aef4812710fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "de404464-cc0c-4fdf-b7dc-aef4812710fe" (UID: "de404464-cc0c-4fdf-b7dc-aef4812710fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.842418 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de404464-cc0c-4fdf-b7dc-aef4812710fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "de404464-cc0c-4fdf-b7dc-aef4812710fe" (UID: "de404464-cc0c-4fdf-b7dc-aef4812710fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.842554 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de404464-cc0c-4fdf-b7dc-aef4812710fe-kube-api-access-mr5mb" (OuterVolumeSpecName: "kube-api-access-mr5mb") pod "de404464-cc0c-4fdf-b7dc-aef4812710fe" (UID: "de404464-cc0c-4fdf-b7dc-aef4812710fe"). InnerVolumeSpecName "kube-api-access-mr5mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.940259 4852 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de404464-cc0c-4fdf-b7dc-aef4812710fe-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.940668 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr5mb\" (UniqueName: \"kubernetes.io/projected/de404464-cc0c-4fdf-b7dc-aef4812710fe-kube-api-access-mr5mb\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:03 crc kubenswrapper[4852]: I1210 12:30:03.940771 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de404464-cc0c-4fdf-b7dc-aef4812710fe-config-volume\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:04 crc kubenswrapper[4852]: I1210 12:30:04.346288 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" event={"ID":"de404464-cc0c-4fdf-b7dc-aef4812710fe","Type":"ContainerDied","Data":"ce5569cac995f2af5aebeb21b1acaa4c63c12c535972e11bd1cc4954393809c8"} Dec 10 12:30:04 crc kubenswrapper[4852]: I1210 12:30:04.346337 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce5569cac995f2af5aebeb21b1acaa4c63c12c535972e11bd1cc4954393809c8" Dec 10 12:30:04 crc kubenswrapper[4852]: I1210 12:30:04.346366 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422830-hc7vs" Dec 10 12:30:04 crc kubenswrapper[4852]: I1210 12:30:04.396098 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"] Dec 10 12:30:04 crc kubenswrapper[4852]: I1210 12:30:04.404186 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422785-vrx7c"] Dec 10 12:30:06 crc kubenswrapper[4852]: I1210 12:30:06.183420 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d28862-df31-4d6c-af29-5fa5b49104ae" path="/var/lib/kubelet/pods/a3d28862-df31-4d6c-af29-5fa5b49104ae/volumes" Dec 10 12:30:13 crc kubenswrapper[4852]: I1210 12:30:13.422030 4852 generic.go:334] "Generic (PLEG): container finished" podID="ca5421d7-d674-4ead-b580-d8c63cdffb0c" containerID="e37a14b041b5d161e96dc500bcc21ee0c8dd61e3722ad510c74bdafbbdef0c27" exitCode=0 Dec 10 12:30:13 crc kubenswrapper[4852]: I1210 12:30:13.422144 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" event={"ID":"ca5421d7-d674-4ead-b580-d8c63cdffb0c","Type":"ContainerDied","Data":"e37a14b041b5d161e96dc500bcc21ee0c8dd61e3722ad510c74bdafbbdef0c27"} Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.883912 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889004 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889044 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-bootstrap-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889087 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889134 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-libvirt-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889152 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndsw\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-kube-api-access-6ndsw\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889192 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ovn-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889209 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-telemetry-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889252 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-nova-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889271 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ssh-key\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 
12:30:14.889294 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-repo-setup-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889326 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-neutron-metadata-combined-ca-bundle\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889347 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889375 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-inventory\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.889395 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\" (UID: \"ca5421d7-d674-4ead-b580-d8c63cdffb0c\") " Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.896155 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.899153 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.899836 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.899980 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.900041 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.900544 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.903524 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.904894 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.905026 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-kube-api-access-6ndsw" (OuterVolumeSpecName: "kube-api-access-6ndsw") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "kube-api-access-6ndsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.905048 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.905488 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.911581 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.938417 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.948762 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-inventory" (OuterVolumeSpecName: "inventory") pod "ca5421d7-d674-4ead-b580-d8c63cdffb0c" (UID: "ca5421d7-d674-4ead-b580-d8c63cdffb0c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991106 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991142 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991153 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991165 4852 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991176 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991189 4852 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991200 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndsw\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-kube-api-access-6ndsw\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991212 4852 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991220 4852 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991244 4852 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991252 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991261 4852 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991272 4852 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5421d7-d674-4ead-b580-d8c63cdffb0c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:14 crc kubenswrapper[4852]: I1210 12:30:14.991282 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca5421d7-d674-4ead-b580-d8c63cdffb0c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.445400 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" event={"ID":"ca5421d7-d674-4ead-b580-d8c63cdffb0c","Type":"ContainerDied","Data":"cb98c15edb232545bb1a8fc75417d5a7b0592b30203c29b850c6bc47c6e60f18"} Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.445833 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb98c15edb232545bb1a8fc75417d5a7b0592b30203c29b850c6bc47c6e60f18" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.445485 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nshgs" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.538487 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx"] Dec 10 12:30:15 crc kubenswrapper[4852]: E1210 12:30:15.539591 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de404464-cc0c-4fdf-b7dc-aef4812710fe" containerName="collect-profiles" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.539619 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="de404464-cc0c-4fdf-b7dc-aef4812710fe" containerName="collect-profiles" Dec 10 12:30:15 crc kubenswrapper[4852]: E1210 12:30:15.539693 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5421d7-d674-4ead-b580-d8c63cdffb0c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.539708 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5421d7-d674-4ead-b580-d8c63cdffb0c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.543594 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="de404464-cc0c-4fdf-b7dc-aef4812710fe" containerName="collect-profiles" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.543653 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5421d7-d674-4ead-b580-d8c63cdffb0c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.544768 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.550214 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.553155 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.553344 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.553434 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.553487 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.571398 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx"] Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.605343 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.605391 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79qb\" (UniqueName: 
\"kubernetes.io/projected/438ab74a-135c-480f-9335-9e2f4f81c0c2-kube-api-access-l79qb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.605481 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.605519 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.605644 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.707174 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.707318 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.707348 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l79qb\" (UniqueName: \"kubernetes.io/projected/438ab74a-135c-480f-9335-9e2f4f81c0c2-kube-api-access-l79qb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.707426 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.707467 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.708362 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.712061 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.713358 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.714744 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.726766 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l79qb\" (UniqueName: \"kubernetes.io/projected/438ab74a-135c-480f-9335-9e2f4f81c0c2-kube-api-access-l79qb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9pkxx\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:15 crc kubenswrapper[4852]: I1210 12:30:15.870559 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:30:16 crc kubenswrapper[4852]: I1210 12:30:16.422448 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx"] Dec 10 12:30:16 crc kubenswrapper[4852]: I1210 12:30:16.454115 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" event={"ID":"438ab74a-135c-480f-9335-9e2f4f81c0c2","Type":"ContainerStarted","Data":"1643d0edea487a04b391a90dbd03cb399aa4a948d74f7f1eecc8d55ba20fd3ee"} Dec 10 12:30:17 crc kubenswrapper[4852]: I1210 12:30:17.463751 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" event={"ID":"438ab74a-135c-480f-9335-9e2f4f81c0c2","Type":"ContainerStarted","Data":"35c3c05ab2d30603b43478d2c18b1cf770cea3d8ff786f448abfc3ddfb9f2652"} Dec 10 12:30:17 crc kubenswrapper[4852]: I1210 12:30:17.485584 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" podStartSLOduration=1.986895742 podStartE2EDuration="2.4855621s" podCreationTimestamp="2025-12-10 12:30:15 +0000 UTC" firstStartedPulling="2025-12-10 12:30:16.427489828 +0000 UTC m=+2302.513015052" lastFinishedPulling="2025-12-10 12:30:16.926156186 +0000 UTC m=+2303.011681410" observedRunningTime="2025-12-10 12:30:17.478568393 +0000 UTC m=+2303.564093627" watchObservedRunningTime="2025-12-10 12:30:17.4855621 +0000 UTC m=+2303.571087334" Dec 10 12:31:05 crc kubenswrapper[4852]: I1210 12:31:05.614977 4852 scope.go:117] "RemoveContainer" containerID="6aadf508c61ddfe3c85a5a956f8a4ae044e4c217523b4ad73cc35db9b6dcdbe4" Dec 10 12:31:15 crc kubenswrapper[4852]: I1210 12:31:15.790383 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:31:15 crc kubenswrapper[4852]: I1210 12:31:15.791065 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:31:19 crc kubenswrapper[4852]: I1210 12:31:19.128719 4852 generic.go:334] "Generic (PLEG): container finished" podID="438ab74a-135c-480f-9335-9e2f4f81c0c2" containerID="35c3c05ab2d30603b43478d2c18b1cf770cea3d8ff786f448abfc3ddfb9f2652" exitCode=0 Dec 10 12:31:19 crc kubenswrapper[4852]: I1210 12:31:19.129212 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" event={"ID":"438ab74a-135c-480f-9335-9e2f4f81c0c2","Type":"ContainerDied","Data":"35c3c05ab2d30603b43478d2c18b1cf770cea3d8ff786f448abfc3ddfb9f2652"} Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.577767 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.627276 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l79qb\" (UniqueName: \"kubernetes.io/projected/438ab74a-135c-480f-9335-9e2f4f81c0c2-kube-api-access-l79qb\") pod \"438ab74a-135c-480f-9335-9e2f4f81c0c2\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.627330 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovn-combined-ca-bundle\") pod \"438ab74a-135c-480f-9335-9e2f4f81c0c2\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.627362 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ssh-key\") pod \"438ab74a-135c-480f-9335-9e2f4f81c0c2\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.627397 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-inventory\") pod \"438ab74a-135c-480f-9335-9e2f4f81c0c2\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.627523 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovncontroller-config-0\") pod \"438ab74a-135c-480f-9335-9e2f4f81c0c2\" (UID: \"438ab74a-135c-480f-9335-9e2f4f81c0c2\") " Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.633599 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438ab74a-135c-480f-9335-9e2f4f81c0c2-kube-api-access-l79qb" (OuterVolumeSpecName: "kube-api-access-l79qb") pod "438ab74a-135c-480f-9335-9e2f4f81c0c2" (UID: "438ab74a-135c-480f-9335-9e2f4f81c0c2"). InnerVolumeSpecName "kube-api-access-l79qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.635123 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "438ab74a-135c-480f-9335-9e2f4f81c0c2" (UID: "438ab74a-135c-480f-9335-9e2f4f81c0c2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.658214 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-inventory" (OuterVolumeSpecName: "inventory") pod "438ab74a-135c-480f-9335-9e2f4f81c0c2" (UID: "438ab74a-135c-480f-9335-9e2f4f81c0c2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.660377 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "438ab74a-135c-480f-9335-9e2f4f81c0c2" (UID: "438ab74a-135c-480f-9335-9e2f4f81c0c2"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.661484 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "438ab74a-135c-480f-9335-9e2f4f81c0c2" (UID: "438ab74a-135c-480f-9335-9e2f4f81c0c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.730463 4852 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.730519 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l79qb\" (UniqueName: \"kubernetes.io/projected/438ab74a-135c-480f-9335-9e2f4f81c0c2-kube-api-access-l79qb\") on node \"crc\" DevicePath \"\"" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.730538 4852 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.730550 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:31:20 crc kubenswrapper[4852]: I1210 12:31:20.730564 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438ab74a-135c-480f-9335-9e2f4f81c0c2-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.150735 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" event={"ID":"438ab74a-135c-480f-9335-9e2f4f81c0c2","Type":"ContainerDied","Data":"1643d0edea487a04b391a90dbd03cb399aa4a948d74f7f1eecc8d55ba20fd3ee"} Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.150809 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9pkxx" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.150814 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1643d0edea487a04b391a90dbd03cb399aa4a948d74f7f1eecc8d55ba20fd3ee" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.263768 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh"] Dec 10 12:31:21 crc kubenswrapper[4852]: E1210 12:31:21.264131 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438ab74a-135c-480f-9335-9e2f4f81c0c2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.264148 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="438ab74a-135c-480f-9335-9e2f4f81c0c2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.264376 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="438ab74a-135c-480f-9335-9e2f4f81c0c2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.265099 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.267387 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.268315 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.269352 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.269637 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.269810 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.270639 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.300941 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh"] Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.339297 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.339368 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.339417 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.339518 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.339554 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.339578 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvxwz\" (UniqueName: \"kubernetes.io/projected/9d30136b-22e2-4932-9da4-836b2368d7bc-kube-api-access-gvxwz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.440304 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.440353 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvxwz\" (UniqueName: \"kubernetes.io/projected/9d30136b-22e2-4932-9da4-836b2368d7bc-kube-api-access-gvxwz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.440424 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc 
kubenswrapper[4852]: I1210 12:31:21.440474 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.440521 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.440546 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.446400 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.449158 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.449849 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.452799 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.454084 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: 
\"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.467266 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvxwz\" (UniqueName: \"kubernetes.io/projected/9d30136b-22e2-4932-9da4-836b2368d7bc-kube-api-access-gvxwz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:21 crc kubenswrapper[4852]: I1210 12:31:21.599760 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:31:22 crc kubenswrapper[4852]: I1210 12:31:22.162191 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:31:22 crc kubenswrapper[4852]: I1210 12:31:22.165514 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh"] Dec 10 12:31:23 crc kubenswrapper[4852]: I1210 12:31:23.171330 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" event={"ID":"9d30136b-22e2-4932-9da4-836b2368d7bc","Type":"ContainerStarted","Data":"b59834099475c9203c77c8dc394a096856ec8614fc9928daabf4124c2bd5845e"} Dec 10 12:31:23 crc kubenswrapper[4852]: I1210 12:31:23.171804 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" event={"ID":"9d30136b-22e2-4932-9da4-836b2368d7bc","Type":"ContainerStarted","Data":"4e9f94e562e6b2fd76b65eac7b3f2af4e514019e9808c6a30de23a65da7af391"} Dec 10 12:31:23 crc kubenswrapper[4852]: I1210 12:31:23.203123 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" podStartSLOduration=1.73818966 podStartE2EDuration="2.203103175s" podCreationTimestamp="2025-12-10 12:31:21 +0000 UTC" firstStartedPulling="2025-12-10 12:31:22.16194035 +0000 UTC m=+2368.247465574" lastFinishedPulling="2025-12-10 12:31:22.626853865 +0000 UTC m=+2368.712379089" observedRunningTime="2025-12-10 12:31:23.188777372 +0000 UTC m=+2369.274302636" watchObservedRunningTime="2025-12-10 12:31:23.203103175 +0000 UTC m=+2369.288628419" Dec 10 12:31:45 crc kubenswrapper[4852]: I1210 12:31:45.790488 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:31:45 crc kubenswrapper[4852]: I1210 12:31:45.791116 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:32:09 crc kubenswrapper[4852]: I1210 12:32:09.638425 4852 generic.go:334] "Generic (PLEG): container finished" podID="9d30136b-22e2-4932-9da4-836b2368d7bc" containerID="b59834099475c9203c77c8dc394a096856ec8614fc9928daabf4124c2bd5845e" exitCode=0 Dec 10 12:32:09 crc kubenswrapper[4852]: I1210 
12:32:09.638646 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" event={"ID":"9d30136b-22e2-4932-9da4-836b2368d7bc","Type":"ContainerDied","Data":"b59834099475c9203c77c8dc394a096856ec8614fc9928daabf4124c2bd5845e"} Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.085988 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.276127 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9d30136b-22e2-4932-9da4-836b2368d7bc\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.276192 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-nova-metadata-neutron-config-0\") pod \"9d30136b-22e2-4932-9da4-836b2368d7bc\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.276316 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-ssh-key\") pod \"9d30136b-22e2-4932-9da4-836b2368d7bc\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.276383 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-metadata-combined-ca-bundle\") pod \"9d30136b-22e2-4932-9da4-836b2368d7bc\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.276447 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvxwz\" (UniqueName: \"kubernetes.io/projected/9d30136b-22e2-4932-9da4-836b2368d7bc-kube-api-access-gvxwz\") pod \"9d30136b-22e2-4932-9da4-836b2368d7bc\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.276496 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-inventory\") pod \"9d30136b-22e2-4932-9da4-836b2368d7bc\" (UID: \"9d30136b-22e2-4932-9da4-836b2368d7bc\") " Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.282007 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9d30136b-22e2-4932-9da4-836b2368d7bc" (UID: "9d30136b-22e2-4932-9da4-836b2368d7bc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.282721 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d30136b-22e2-4932-9da4-836b2368d7bc-kube-api-access-gvxwz" (OuterVolumeSpecName: "kube-api-access-gvxwz") pod "9d30136b-22e2-4932-9da4-836b2368d7bc" (UID: "9d30136b-22e2-4932-9da4-836b2368d7bc"). InnerVolumeSpecName "kube-api-access-gvxwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.304510 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9d30136b-22e2-4932-9da4-836b2368d7bc" (UID: "9d30136b-22e2-4932-9da4-836b2368d7bc"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.305936 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9d30136b-22e2-4932-9da4-836b2368d7bc" (UID: "9d30136b-22e2-4932-9da4-836b2368d7bc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.307249 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9d30136b-22e2-4932-9da4-836b2368d7bc" (UID: "9d30136b-22e2-4932-9da4-836b2368d7bc"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.307553 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-inventory" (OuterVolumeSpecName: "inventory") pod "9d30136b-22e2-4932-9da4-836b2368d7bc" (UID: "9d30136b-22e2-4932-9da4-836b2368d7bc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.378472 4852 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.378503 4852 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.378514 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.378524 4852 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.378534 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvxwz\" (UniqueName: \"kubernetes.io/projected/9d30136b-22e2-4932-9da4-836b2368d7bc-kube-api-access-gvxwz\") on node \"crc\" DevicePath \"\"" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.378542 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d30136b-22e2-4932-9da4-836b2368d7bc-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.660354 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" event={"ID":"9d30136b-22e2-4932-9da4-836b2368d7bc","Type":"ContainerDied","Data":"4e9f94e562e6b2fd76b65eac7b3f2af4e514019e9808c6a30de23a65da7af391"} Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.660393 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e9f94e562e6b2fd76b65eac7b3f2af4e514019e9808c6a30de23a65da7af391" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.660434 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.750871 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd"] Dec 10 12:32:11 crc kubenswrapper[4852]: E1210 12:32:11.751386 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d30136b-22e2-4932-9da4-836b2368d7bc" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.751409 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d30136b-22e2-4932-9da4-836b2368d7bc" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.751587 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d30136b-22e2-4932-9da4-836b2368d7bc" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.752302 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.754192 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.754774 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.755059 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.755220 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.757026 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.778073 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd"] Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.890670 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.890830 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.890859 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.890933 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g986\" (UniqueName: \"kubernetes.io/projected/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-kube-api-access-6g986\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.891089 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.993276 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.993323 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.993341 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g986\" (UniqueName: \"kubernetes.io/projected/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-kube-api-access-6g986\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.993381 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.993444 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.997275 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.997543 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:11 crc kubenswrapper[4852]: I1210 12:32:11.998162 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:12 crc kubenswrapper[4852]: I1210 12:32:12.004138 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:12 crc kubenswrapper[4852]: I1210 12:32:12.011163 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g986\" (UniqueName: \"kubernetes.io/projected/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-kube-api-access-6g986\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:12 crc kubenswrapper[4852]: I1210 12:32:12.083575 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:32:12 crc kubenswrapper[4852]: I1210 12:32:12.577545 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd"] Dec 10 12:32:12 crc kubenswrapper[4852]: I1210 12:32:12.674422 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" event={"ID":"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f","Type":"ContainerStarted","Data":"1f24931b3da2cea5d180709e8f6e61f7a1baa7d4fca4d32a2f1b02a829d478ac"} Dec 10 12:32:13 crc kubenswrapper[4852]: I1210 12:32:13.687405 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" event={"ID":"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f","Type":"ContainerStarted","Data":"47fd36a0e69d30f16e40e6726a8617a46596013a6ad15dbde5c11286f1282447"} Dec 10 12:32:13 crc kubenswrapper[4852]: I1210 12:32:13.707059 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" podStartSLOduration=2.256701337 podStartE2EDuration="2.707041256s" podCreationTimestamp="2025-12-10 12:32:11 +0000 UTC" firstStartedPulling="2025-12-10 12:32:12.579305612 +0000 UTC m=+2418.664830836" lastFinishedPulling="2025-12-10 12:32:13.029645531 +0000 UTC m=+2419.115170755" observedRunningTime="2025-12-10 12:32:13.70438642 +0000 UTC m=+2419.789911654" watchObservedRunningTime="2025-12-10 12:32:13.707041256 +0000 UTC m=+2419.792566480" Dec 10 12:32:15 crc kubenswrapper[4852]: I1210 12:32:15.790767 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:32:15 crc kubenswrapper[4852]: I1210 12:32:15.791058 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:32:15 crc kubenswrapper[4852]: I1210 12:32:15.791109 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:32:15 crc kubenswrapper[4852]: I1210 12:32:15.791964 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd"} 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:32:15 crc kubenswrapper[4852]: I1210 12:32:15.792033 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" gracePeriod=600 Dec 10 12:32:15 crc kubenswrapper[4852]: E1210 12:32:15.912440 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:32:16 crc kubenswrapper[4852]: I1210 12:32:16.714330 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" exitCode=0 Dec 10 12:32:16 crc kubenswrapper[4852]: I1210 12:32:16.714396 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd"} Dec 10 12:32:16 crc kubenswrapper[4852]: I1210 12:32:16.714502 4852 scope.go:117] "RemoveContainer" containerID="f6352925c4a9867f9532e5b81e9bae6d86319db1a6b4361b34333b9d6c861d17" Dec 10 12:32:16 crc kubenswrapper[4852]: I1210 12:32:16.715474 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:32:16 crc kubenswrapper[4852]: E1210 12:32:16.716311 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:32:27 crc kubenswrapper[4852]: I1210 12:32:27.169801 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:32:27 crc kubenswrapper[4852]: E1210 12:32:27.170625 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:32:40 crc kubenswrapper[4852]: I1210 12:32:40.170105 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:32:40 crc kubenswrapper[4852]: E1210 12:32:40.170960 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:32:52 crc kubenswrapper[4852]: I1210 12:32:52.169815 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:32:52 crc kubenswrapper[4852]: E1210 12:32:52.170727 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:33:03 crc kubenswrapper[4852]: I1210 12:33:03.170064 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:33:03 crc kubenswrapper[4852]: E1210 12:33:03.170844 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:33:14 crc kubenswrapper[4852]: I1210 12:33:14.175304 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:33:14 crc kubenswrapper[4852]: E1210 12:33:14.176058 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:33:28 crc kubenswrapper[4852]: I1210 12:33:28.170614 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:33:28 crc kubenswrapper[4852]: E1210 12:33:28.171666 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:33:42 crc kubenswrapper[4852]: I1210 12:33:42.173858 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:33:42 crc kubenswrapper[4852]: E1210 12:33:42.175059 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:33:54 crc kubenswrapper[4852]: I1210 12:33:54.182916 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:33:54 crc kubenswrapper[4852]: E1210 12:33:54.185484 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:34:06 crc kubenswrapper[4852]: I1210 12:34:06.169382 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:34:06 crc kubenswrapper[4852]: E1210 12:34:06.170054 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:34:21 crc kubenswrapper[4852]: I1210 12:34:21.177369 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:34:21 crc kubenswrapper[4852]: E1210 12:34:21.179376 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:34:36 crc kubenswrapper[4852]: I1210 12:34:36.170613 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:34:36 crc kubenswrapper[4852]: E1210 12:34:36.171368 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:34:51 crc kubenswrapper[4852]: I1210 12:34:51.170311 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:34:51 crc kubenswrapper[4852]: E1210 12:34:51.171419 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:35:04 crc kubenswrapper[4852]: I1210 12:35:04.176681 4852 
scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:35:04 crc kubenswrapper[4852]: E1210 12:35:04.177694 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:35:18 crc kubenswrapper[4852]: I1210 12:35:18.169964 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:35:18 crc kubenswrapper[4852]: E1210 12:35:18.170794 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:35:29 crc kubenswrapper[4852]: I1210 12:35:29.170763 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:35:29 crc kubenswrapper[4852]: E1210 12:35:29.172011 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:35:43 crc kubenswrapper[4852]: I1210 12:35:43.170061 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:35:43 crc kubenswrapper[4852]: E1210 12:35:43.170945 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:35:55 crc kubenswrapper[4852]: I1210 12:35:55.170272 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:35:55 crc kubenswrapper[4852]: E1210 12:35:55.170971 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:36:07 crc kubenswrapper[4852]: I1210 12:36:07.170222 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:36:07 crc kubenswrapper[4852]: E1210 12:36:07.171111 4852 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:36:21 crc kubenswrapper[4852]: I1210 12:36:21.170220 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:36:21 crc kubenswrapper[4852]: E1210 12:36:21.171097 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:36:24 crc kubenswrapper[4852]: I1210 12:36:24.043886 4852 generic.go:334] "Generic (PLEG): container finished" podID="2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" containerID="47fd36a0e69d30f16e40e6726a8617a46596013a6ad15dbde5c11286f1282447" exitCode=0 Dec 10 12:36:24 crc kubenswrapper[4852]: I1210 12:36:24.043950 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" event={"ID":"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f","Type":"ContainerDied","Data":"47fd36a0e69d30f16e40e6726a8617a46596013a6ad15dbde5c11286f1282447"} Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.454572 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.625187 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-inventory\") pod \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.625257 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-secret-0\") pod \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.625377 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-ssh-key\") pod \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.625427 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g986\" (UniqueName: \"kubernetes.io/projected/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-kube-api-access-6g986\") pod \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.625526 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-combined-ca-bundle\") pod \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\" (UID: \"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f\") " Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.633406 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" (UID: "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.633457 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-kube-api-access-6g986" (OuterVolumeSpecName: "kube-api-access-6g986") pod "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" (UID: "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f"). InnerVolumeSpecName "kube-api-access-6g986". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.657034 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" (UID: "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.662892 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-inventory" (OuterVolumeSpecName: "inventory") pod "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" (UID: "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.673102 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" (UID: "2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.726976 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.726999 4852 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.727010 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.727018 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g986\" (UniqueName: \"kubernetes.io/projected/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-kube-api-access-6g986\") on node \"crc\" DevicePath \"\"" Dec 10 12:36:25 crc kubenswrapper[4852]: I1210 12:36:25.727027 4852 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.061750 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" event={"ID":"2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f","Type":"ContainerDied","Data":"1f24931b3da2cea5d180709e8f6e61f7a1baa7d4fca4d32a2f1b02a829d478ac"} Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.061796 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f24931b3da2cea5d180709e8f6e61f7a1baa7d4fca4d32a2f1b02a829d478ac" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.062310 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.181072 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh"] Dec 10 12:36:26 crc kubenswrapper[4852]: E1210 12:36:26.181402 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.181418 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.181629 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.182282 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.183970 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.184186 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.185706 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.185755 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.185805 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.185839 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.185941 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.188375 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh"] Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338201 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338312 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338373 4852 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338408 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338443 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338472 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338550 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r64dl\" (UniqueName: \"kubernetes.io/projected/4d0aea88-1cca-4e75-bc26-15c9f44d8682-kube-api-access-r64dl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338589 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.338636 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.440385 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 
crc kubenswrapper[4852]: I1210 12:36:26.440470 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.440534 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.440706 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.441100 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r64dl\" (UniqueName: \"kubernetes.io/projected/4d0aea88-1cca-4e75-bc26-15c9f44d8682-kube-api-access-r64dl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.441180 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.441275 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.441408 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.441537 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.443020 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.444316 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.444985 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.445546 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.451858 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.452120 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.452366 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.453063 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.467105 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r64dl\" (UniqueName: 
\"kubernetes.io/projected/4d0aea88-1cca-4e75-bc26-15c9f44d8682-kube-api-access-r64dl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-bjrzh\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:26 crc kubenswrapper[4852]: I1210 12:36:26.500775 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:36:27 crc kubenswrapper[4852]: I1210 12:36:27.028417 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh"] Dec 10 12:36:27 crc kubenswrapper[4852]: I1210 12:36:27.034426 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:36:27 crc kubenswrapper[4852]: I1210 12:36:27.069284 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" event={"ID":"4d0aea88-1cca-4e75-bc26-15c9f44d8682","Type":"ContainerStarted","Data":"fa42a224a49085fc6a3d826d8da6a2f320819827c93bcb33b62f6b0950e4f04b"} Dec 10 12:36:28 crc kubenswrapper[4852]: I1210 12:36:28.083934 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" event={"ID":"4d0aea88-1cca-4e75-bc26-15c9f44d8682","Type":"ContainerStarted","Data":"ade6fb426f686daa073a2f7a69d61db6387d2c9c9d9994965464463c8e21d133"} Dec 10 12:36:28 crc kubenswrapper[4852]: I1210 12:36:28.106638 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" podStartSLOduration=1.645331734 podStartE2EDuration="2.106621745s" podCreationTimestamp="2025-12-10 12:36:26 +0000 UTC" firstStartedPulling="2025-12-10 12:36:27.03414369 +0000 UTC m=+2673.119668914" lastFinishedPulling="2025-12-10 12:36:27.495433701 +0000 UTC m=+2673.580958925" observedRunningTime="2025-12-10 12:36:28.104621625 +0000 UTC m=+2674.190146859" watchObservedRunningTime="2025-12-10 12:36:28.106621745 +0000 UTC m=+2674.192146969" Dec 10 12:36:35 crc kubenswrapper[4852]: I1210 12:36:35.169664 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:36:35 crc kubenswrapper[4852]: E1210 12:36:35.170644 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:36:48 crc kubenswrapper[4852]: I1210 12:36:48.170277 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:36:48 crc kubenswrapper[4852]: E1210 12:36:48.170982 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:37:03 crc kubenswrapper[4852]: I1210 12:37:03.169817 4852 scope.go:117] 
"RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:37:03 crc kubenswrapper[4852]: E1210 12:37:03.170970 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:37:18 crc kubenswrapper[4852]: I1210 12:37:18.170063 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:37:18 crc kubenswrapper[4852]: I1210 12:37:18.552078 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"e9d8e5eeae465e987e4c229fef97a1d891d93af600c6d4356b1fceb18f67fed4"} Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.373078 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gh7f5"] Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.376031 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.388715 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh7f5"] Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.553333 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrjq\" (UniqueName: \"kubernetes.io/projected/867c1192-3a4b-4621-8a3b-be95629e6b21-kube-api-access-clrjq\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.553436 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-utilities\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.553943 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-catalog-content\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.570690 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2tm8w"] Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.573070 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.581844 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tm8w"] Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.711151 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-utilities\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.712213 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clrjq\" (UniqueName: \"kubernetes.io/projected/867c1192-3a4b-4621-8a3b-be95629e6b21-kube-api-access-clrjq\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.712335 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-catalog-content\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.712365 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k85k\" (UniqueName: \"kubernetes.io/projected/c79ba487-8000-4406-8b79-d970f854757b-kube-api-access-5k85k\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.712388 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-utilities\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.712422 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-catalog-content\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.713048 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-catalog-content\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.713159 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-utilities\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.752325 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-clrjq\" (UniqueName: \"kubernetes.io/projected/867c1192-3a4b-4621-8a3b-be95629e6b21-kube-api-access-clrjq\") pod \"redhat-marketplace-gh7f5\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.816477 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-catalog-content\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.816528 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k85k\" (UniqueName: \"kubernetes.io/projected/c79ba487-8000-4406-8b79-d970f854757b-kube-api-access-5k85k\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.816564 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-utilities\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.817063 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-utilities\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.817398 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-catalog-content\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:45 crc kubenswrapper[4852]: I1210 12:37:45.842902 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k85k\" (UniqueName: \"kubernetes.io/projected/c79ba487-8000-4406-8b79-d970f854757b-kube-api-access-5k85k\") pod \"redhat-operators-2tm8w\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.011689 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.033992 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.595864 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh7f5"] Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.605620 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tm8w"] Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.816553 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerStarted","Data":"11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a"} Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.816608 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerStarted","Data":"f89a797576bf325721a15d3a48d7c71e3d27590f234dbba89ecae6365288c5de"} Dec 10 12:37:46 crc kubenswrapper[4852]: I1210 12:37:46.820646 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerStarted","Data":"19bfe8c0ce585437fe30589eaac9e477b997f11867bb4e2500381902d506206a"} Dec 10 12:37:47 crc kubenswrapper[4852]: I1210 12:37:47.835820 4852 generic.go:334] "Generic (PLEG): container finished" podID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerID="11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a" exitCode=0 Dec 10 12:37:47 crc kubenswrapper[4852]: I1210 12:37:47.835914 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerDied","Data":"11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a"} Dec 10 12:37:47 crc kubenswrapper[4852]: I1210 12:37:47.840434 4852 generic.go:334] "Generic (PLEG): container finished" podID="c79ba487-8000-4406-8b79-d970f854757b" containerID="3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e" exitCode=0 Dec 10 12:37:47 crc kubenswrapper[4852]: I1210 12:37:47.840483 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerDied","Data":"3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e"} Dec 10 12:37:49 crc kubenswrapper[4852]: I1210 12:37:49.864845 4852 generic.go:334] "Generic (PLEG): container finished" podID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerID="c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382" exitCode=0 Dec 10 12:37:49 crc kubenswrapper[4852]: I1210 12:37:49.865149 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerDied","Data":"c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382"} Dec 10 12:37:49 crc kubenswrapper[4852]: I1210 12:37:49.880317 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerStarted","Data":"5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a"} Dec 10 12:37:51 crc kubenswrapper[4852]: I1210 12:37:51.900941 4852 generic.go:334] "Generic (PLEG): container finished" 
podID="c79ba487-8000-4406-8b79-d970f854757b" containerID="5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a" exitCode=0 Dec 10 12:37:51 crc kubenswrapper[4852]: I1210 12:37:51.901050 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerDied","Data":"5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a"} Dec 10 12:37:51 crc kubenswrapper[4852]: I1210 12:37:51.907587 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerStarted","Data":"02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd"} Dec 10 12:37:51 crc kubenswrapper[4852]: I1210 12:37:51.941725 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gh7f5" podStartSLOduration=4.244719486 podStartE2EDuration="6.94170712s" podCreationTimestamp="2025-12-10 12:37:45 +0000 UTC" firstStartedPulling="2025-12-10 12:37:47.839192262 +0000 UTC m=+2753.924717496" lastFinishedPulling="2025-12-10 12:37:50.536179906 +0000 UTC m=+2756.621705130" observedRunningTime="2025-12-10 12:37:51.935593447 +0000 UTC m=+2758.021118671" watchObservedRunningTime="2025-12-10 12:37:51.94170712 +0000 UTC m=+2758.027232344" Dec 10 12:37:54 crc kubenswrapper[4852]: I1210 12:37:54.941878 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerStarted","Data":"9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3"} Dec 10 12:37:54 crc kubenswrapper[4852]: I1210 12:37:54.964579 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2tm8w" podStartSLOduration=3.448223483 podStartE2EDuration="9.964553592s" podCreationTimestamp="2025-12-10 12:37:45 +0000 UTC" firstStartedPulling="2025-12-10 12:37:47.843324396 +0000 UTC m=+2753.928849660" lastFinishedPulling="2025-12-10 12:37:54.359654555 +0000 UTC m=+2760.445179769" observedRunningTime="2025-12-10 12:37:54.962349877 +0000 UTC m=+2761.047875181" watchObservedRunningTime="2025-12-10 12:37:54.964553592 +0000 UTC m=+2761.050078836" Dec 10 12:37:56 crc kubenswrapper[4852]: I1210 12:37:56.012999 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:56 crc kubenswrapper[4852]: I1210 12:37:56.013057 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:56 crc kubenswrapper[4852]: I1210 12:37:56.035509 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:56 crc kubenswrapper[4852]: I1210 12:37:56.035561 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:37:56 crc kubenswrapper[4852]: I1210 12:37:56.068726 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:57 crc kubenswrapper[4852]: I1210 12:37:57.027576 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:57 crc kubenswrapper[4852]: I1210 12:37:57.103372 
4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tm8w" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="registry-server" probeResult="failure" output=< Dec 10 12:37:57 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s Dec 10 12:37:57 crc kubenswrapper[4852]: > Dec 10 12:37:58 crc kubenswrapper[4852]: I1210 12:37:58.159659 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh7f5"] Dec 10 12:37:58 crc kubenswrapper[4852]: I1210 12:37:58.978973 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gh7f5" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="registry-server" containerID="cri-o://02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd" gracePeriod=2 Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.519682 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.632264 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-catalog-content\") pod \"867c1192-3a4b-4621-8a3b-be95629e6b21\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.632450 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-utilities\") pod \"867c1192-3a4b-4621-8a3b-be95629e6b21\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.632479 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clrjq\" (UniqueName: \"kubernetes.io/projected/867c1192-3a4b-4621-8a3b-be95629e6b21-kube-api-access-clrjq\") pod \"867c1192-3a4b-4621-8a3b-be95629e6b21\" (UID: \"867c1192-3a4b-4621-8a3b-be95629e6b21\") " Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.633342 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-utilities" (OuterVolumeSpecName: "utilities") pod "867c1192-3a4b-4621-8a3b-be95629e6b21" (UID: "867c1192-3a4b-4621-8a3b-be95629e6b21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.639329 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867c1192-3a4b-4621-8a3b-be95629e6b21-kube-api-access-clrjq" (OuterVolumeSpecName: "kube-api-access-clrjq") pod "867c1192-3a4b-4621-8a3b-be95629e6b21" (UID: "867c1192-3a4b-4621-8a3b-be95629e6b21"). InnerVolumeSpecName "kube-api-access-clrjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.651512 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "867c1192-3a4b-4621-8a3b-be95629e6b21" (UID: "867c1192-3a4b-4621-8a3b-be95629e6b21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.734298 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.734332 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867c1192-3a4b-4621-8a3b-be95629e6b21-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.734344 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clrjq\" (UniqueName: \"kubernetes.io/projected/867c1192-3a4b-4621-8a3b-be95629e6b21-kube-api-access-clrjq\") on node \"crc\" DevicePath \"\"" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.992196 4852 generic.go:334] "Generic (PLEG): container finished" podID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerID="02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd" exitCode=0 Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.992282 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerDied","Data":"02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd"} Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.992313 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh7f5" event={"ID":"867c1192-3a4b-4621-8a3b-be95629e6b21","Type":"ContainerDied","Data":"f89a797576bf325721a15d3a48d7c71e3d27590f234dbba89ecae6365288c5de"} Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.992334 4852 scope.go:117] "RemoveContainer" containerID="02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd" Dec 10 12:37:59 crc kubenswrapper[4852]: I1210 12:37:59.992469 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh7f5" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.014773 4852 scope.go:117] "RemoveContainer" containerID="c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.040843 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh7f5"] Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.050410 4852 scope.go:117] "RemoveContainer" containerID="11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.053286 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh7f5"] Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.111814 4852 scope.go:117] "RemoveContainer" containerID="02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd" Dec 10 12:38:00 crc kubenswrapper[4852]: E1210 12:38:00.112245 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd\": container with ID starting with 02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd not found: ID does not exist" containerID="02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.112279 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd"} err="failed to get container status \"02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd\": rpc error: code = NotFound desc = could not find container \"02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd\": container with ID starting with 02a79201ec8d16cceedbf3045c03faea16d1a5f936afe8c0fd5078f30d498bbd not found: ID does not exist" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.112300 4852 scope.go:117] "RemoveContainer" containerID="c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382" Dec 10 12:38:00 crc kubenswrapper[4852]: E1210 12:38:00.112541 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382\": container with ID starting with c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382 not found: ID does not exist" containerID="c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.112566 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382"} err="failed to get container status \"c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382\": rpc error: code = NotFound desc = could not find container \"c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382\": container with ID starting with c928fa94df77362c31c358942781ae43db4e242ca455d33cc63115a146298382 not found: ID does not exist" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.112582 4852 scope.go:117] "RemoveContainer" containerID="11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a" Dec 10 12:38:00 crc kubenswrapper[4852]: E1210 12:38:00.112818 4852 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a\": container with ID starting with 11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a not found: ID does not exist" containerID="11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.112836 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a"} err="failed to get container status \"11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a\": rpc error: code = NotFound desc = could not find container \"11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a\": container with ID starting with 11b17089a6ee58cae1c284814ce9cdd09ff22ce08901d1734b5123f00df9cd6a not found: ID does not exist" Dec 10 12:38:00 crc kubenswrapper[4852]: I1210 12:38:00.181029 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" path="/var/lib/kubelet/pods/867c1192-3a4b-4621-8a3b-be95629e6b21/volumes" Dec 10 12:38:06 crc kubenswrapper[4852]: I1210 12:38:06.086365 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:38:06 crc kubenswrapper[4852]: I1210 12:38:06.147497 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:38:06 crc kubenswrapper[4852]: I1210 12:38:06.324417 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2tm8w"] Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.077131 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2tm8w" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="registry-server" containerID="cri-o://9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3" gracePeriod=2 Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.685627 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.821335 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-catalog-content\") pod \"c79ba487-8000-4406-8b79-d970f854757b\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.821383 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k85k\" (UniqueName: \"kubernetes.io/projected/c79ba487-8000-4406-8b79-d970f854757b-kube-api-access-5k85k\") pod \"c79ba487-8000-4406-8b79-d970f854757b\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.821527 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-utilities\") pod \"c79ba487-8000-4406-8b79-d970f854757b\" (UID: \"c79ba487-8000-4406-8b79-d970f854757b\") " Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.822420 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-utilities" (OuterVolumeSpecName: "utilities") pod "c79ba487-8000-4406-8b79-d970f854757b" (UID: "c79ba487-8000-4406-8b79-d970f854757b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.827407 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79ba487-8000-4406-8b79-d970f854757b-kube-api-access-5k85k" (OuterVolumeSpecName: "kube-api-access-5k85k") pod "c79ba487-8000-4406-8b79-d970f854757b" (UID: "c79ba487-8000-4406-8b79-d970f854757b"). InnerVolumeSpecName "kube-api-access-5k85k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.924591 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k85k\" (UniqueName: \"kubernetes.io/projected/c79ba487-8000-4406-8b79-d970f854757b-kube-api-access-5k85k\") on node \"crc\" DevicePath \"\"" Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.924859 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:38:08 crc kubenswrapper[4852]: I1210 12:38:08.935624 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c79ba487-8000-4406-8b79-d970f854757b" (UID: "c79ba487-8000-4406-8b79-d970f854757b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.026964 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79ba487-8000-4406-8b79-d970f854757b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.088792 4852 generic.go:334] "Generic (PLEG): container finished" podID="c79ba487-8000-4406-8b79-d970f854757b" containerID="9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3" exitCode=0 Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.088875 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2tm8w" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.088868 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerDied","Data":"9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3"} Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.089015 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tm8w" event={"ID":"c79ba487-8000-4406-8b79-d970f854757b","Type":"ContainerDied","Data":"19bfe8c0ce585437fe30589eaac9e477b997f11867bb4e2500381902d506206a"} Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.089050 4852 scope.go:117] "RemoveContainer" containerID="9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.116825 4852 scope.go:117] "RemoveContainer" containerID="5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.129583 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2tm8w"] Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.145601 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2tm8w"] Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.147642 4852 scope.go:117] "RemoveContainer" containerID="3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.178014 4852 scope.go:117] "RemoveContainer" containerID="9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3" Dec 10 12:38:09 crc kubenswrapper[4852]: E1210 12:38:09.178558 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3\": container with ID starting with 9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3 not found: ID does not exist" containerID="9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.178636 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3"} err="failed to get container status \"9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3\": rpc error: code = NotFound desc = could not find container \"9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3\": container with ID starting with 9f6a3153d352b6404f4caa8bc1dfc3e484404781ea5791b8d4e8ba94ee5d36f3 not found: ID does not exist" Dec 10 12:38:09 crc 
kubenswrapper[4852]: I1210 12:38:09.178682 4852 scope.go:117] "RemoveContainer" containerID="5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a" Dec 10 12:38:09 crc kubenswrapper[4852]: E1210 12:38:09.179053 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a\": container with ID starting with 5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a not found: ID does not exist" containerID="5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.179107 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a"} err="failed to get container status \"5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a\": rpc error: code = NotFound desc = could not find container \"5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a\": container with ID starting with 5d790e620c42a79e2755fcf7d64abd42adca4641a871ebad7c809bd2f54cbc7a not found: ID does not exist" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.179138 4852 scope.go:117] "RemoveContainer" containerID="3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e" Dec 10 12:38:09 crc kubenswrapper[4852]: E1210 12:38:09.179538 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e\": container with ID starting with 3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e not found: ID does not exist" containerID="3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e" Dec 10 12:38:09 crc kubenswrapper[4852]: I1210 12:38:09.179602 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e"} err="failed to get container status \"3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e\": rpc error: code = NotFound desc = could not find container \"3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e\": container with ID starting with 3255de67a2b689dd307ded22b322e0043b4db2d1c342046396cf70c9c6a1656e not found: ID does not exist" Dec 10 12:38:10 crc kubenswrapper[4852]: I1210 12:38:10.183863 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79ba487-8000-4406-8b79-d970f854757b" path="/var/lib/kubelet/pods/c79ba487-8000-4406-8b79-d970f854757b/volumes" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.502732 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9xxq9"] Dec 10 12:38:35 crc kubenswrapper[4852]: E1210 12:38:35.504518 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="extract-utilities" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.504541 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="extract-utilities" Dec 10 12:38:35 crc kubenswrapper[4852]: E1210 12:38:35.504593 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="extract-content" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.504604 4852 
state_mem.go:107] "Deleted CPUSet assignment" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="extract-content" Dec 10 12:38:35 crc kubenswrapper[4852]: E1210 12:38:35.504639 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="registry-server" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.504648 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="registry-server" Dec 10 12:38:35 crc kubenswrapper[4852]: E1210 12:38:35.504665 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="extract-utilities" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.504675 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="extract-utilities" Dec 10 12:38:35 crc kubenswrapper[4852]: E1210 12:38:35.504687 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="registry-server" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.504696 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="registry-server" Dec 10 12:38:35 crc kubenswrapper[4852]: E1210 12:38:35.504728 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="extract-content" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.504736 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="extract-content" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.505072 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="867c1192-3a4b-4621-8a3b-be95629e6b21" containerName="registry-server" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.505112 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79ba487-8000-4406-8b79-d970f854757b" containerName="registry-server" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.511376 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.516217 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-utilities\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.516333 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-catalog-content\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.516604 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqq4\" (UniqueName: \"kubernetes.io/projected/5ec9d108-5532-402a-89fe-3baa753e38f5-kube-api-access-4lqq4\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.525782 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9xxq9"] Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.619886 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-utilities\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.620352 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-utilities\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.620699 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-catalog-content\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.620995 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqq4\" (UniqueName: \"kubernetes.io/projected/5ec9d108-5532-402a-89fe-3baa753e38f5-kube-api-access-4lqq4\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.620999 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-catalog-content\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.641363 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4lqq4\" (UniqueName: \"kubernetes.io/projected/5ec9d108-5532-402a-89fe-3baa753e38f5-kube-api-access-4lqq4\") pod \"certified-operators-9xxq9\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:35 crc kubenswrapper[4852]: I1210 12:38:35.849208 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:36 crc kubenswrapper[4852]: I1210 12:38:36.410002 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9xxq9"] Dec 10 12:38:37 crc kubenswrapper[4852]: I1210 12:38:37.376829 4852 generic.go:334] "Generic (PLEG): container finished" podID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerID="d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767" exitCode=0 Dec 10 12:38:37 crc kubenswrapper[4852]: I1210 12:38:37.377027 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerDied","Data":"d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767"} Dec 10 12:38:37 crc kubenswrapper[4852]: I1210 12:38:37.377154 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerStarted","Data":"9ce612bc69f5179847f90a07b26d29ae1cb9f6ae21321d36f62fbff5c9d10881"} Dec 10 12:38:38 crc kubenswrapper[4852]: I1210 12:38:38.387287 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerStarted","Data":"78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f"} Dec 10 12:38:39 crc kubenswrapper[4852]: I1210 12:38:39.428967 4852 generic.go:334] "Generic (PLEG): container finished" podID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerID="78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f" exitCode=0 Dec 10 12:38:39 crc kubenswrapper[4852]: I1210 12:38:39.429295 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerDied","Data":"78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f"} Dec 10 12:38:40 crc kubenswrapper[4852]: I1210 12:38:40.440456 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerStarted","Data":"16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12"} Dec 10 12:38:40 crc kubenswrapper[4852]: I1210 12:38:40.464132 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9xxq9" podStartSLOduration=2.8074216549999997 podStartE2EDuration="5.46411389s" podCreationTimestamp="2025-12-10 12:38:35 +0000 UTC" firstStartedPulling="2025-12-10 12:38:37.37910486 +0000 UTC m=+2803.464630114" lastFinishedPulling="2025-12-10 12:38:40.035797125 +0000 UTC m=+2806.121322349" observedRunningTime="2025-12-10 12:38:40.461154216 +0000 UTC m=+2806.546679460" watchObservedRunningTime="2025-12-10 12:38:40.46411389 +0000 UTC m=+2806.549639114" Dec 10 12:38:45 crc kubenswrapper[4852]: I1210 12:38:45.849503 4852 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:45 crc kubenswrapper[4852]: I1210 12:38:45.850163 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:45 crc kubenswrapper[4852]: I1210 12:38:45.908024 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:46 crc kubenswrapper[4852]: I1210 12:38:46.544963 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:46 crc kubenswrapper[4852]: I1210 12:38:46.598678 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9xxq9"] Dec 10 12:38:48 crc kubenswrapper[4852]: I1210 12:38:48.511917 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9xxq9" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="registry-server" containerID="cri-o://16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12" gracePeriod=2 Dec 10 12:38:48 crc kubenswrapper[4852]: I1210 12:38:48.977002 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.099312 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqq4\" (UniqueName: \"kubernetes.io/projected/5ec9d108-5532-402a-89fe-3baa753e38f5-kube-api-access-4lqq4\") pod \"5ec9d108-5532-402a-89fe-3baa753e38f5\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.099579 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-catalog-content\") pod \"5ec9d108-5532-402a-89fe-3baa753e38f5\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.099611 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-utilities\") pod \"5ec9d108-5532-402a-89fe-3baa753e38f5\" (UID: \"5ec9d108-5532-402a-89fe-3baa753e38f5\") " Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.100411 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-utilities" (OuterVolumeSpecName: "utilities") pod "5ec9d108-5532-402a-89fe-3baa753e38f5" (UID: "5ec9d108-5532-402a-89fe-3baa753e38f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.106515 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec9d108-5532-402a-89fe-3baa753e38f5-kube-api-access-4lqq4" (OuterVolumeSpecName: "kube-api-access-4lqq4") pod "5ec9d108-5532-402a-89fe-3baa753e38f5" (UID: "5ec9d108-5532-402a-89fe-3baa753e38f5"). InnerVolumeSpecName "kube-api-access-4lqq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.150738 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ec9d108-5532-402a-89fe-3baa753e38f5" (UID: "5ec9d108-5532-402a-89fe-3baa753e38f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.201301 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lqq4\" (UniqueName: \"kubernetes.io/projected/5ec9d108-5532-402a-89fe-3baa753e38f5-kube-api-access-4lqq4\") on node \"crc\" DevicePath \"\"" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.201330 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.201339 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec9d108-5532-402a-89fe-3baa753e38f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.521129 4852 generic.go:334] "Generic (PLEG): container finished" podID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerID="16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12" exitCode=0 Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.521172 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerDied","Data":"16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12"} Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.521198 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xxq9" event={"ID":"5ec9d108-5532-402a-89fe-3baa753e38f5","Type":"ContainerDied","Data":"9ce612bc69f5179847f90a07b26d29ae1cb9f6ae21321d36f62fbff5c9d10881"} Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.521215 4852 scope.go:117] "RemoveContainer" containerID="16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.521212 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9xxq9" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.542722 4852 scope.go:117] "RemoveContainer" containerID="78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.569865 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9xxq9"] Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.577237 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9xxq9"] Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.582249 4852 scope.go:117] "RemoveContainer" containerID="d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.607727 4852 scope.go:117] "RemoveContainer" containerID="16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12" Dec 10 12:38:49 crc kubenswrapper[4852]: E1210 12:38:49.609557 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12\": container with ID starting with 16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12 not found: ID does not exist" containerID="16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.609601 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12"} err="failed to get container status \"16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12\": rpc error: code = NotFound desc = could not find container \"16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12\": container with ID starting with 16900aa2d75b692162b0a427336efb9db7fa12c0fec93e0b729fedef95254b12 not found: ID does not exist" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.609628 4852 scope.go:117] "RemoveContainer" containerID="78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f" Dec 10 12:38:49 crc kubenswrapper[4852]: E1210 12:38:49.610010 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f\": container with ID starting with 78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f not found: ID does not exist" containerID="78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.610045 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f"} err="failed to get container status \"78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f\": rpc error: code = NotFound desc = could not find container \"78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f\": container with ID starting with 78610469a4fd1ae6362b3b174e5ffb78502dc833d4d03be380c775705051440f not found: ID does not exist" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.610068 4852 scope.go:117] "RemoveContainer" containerID="d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767" Dec 10 12:38:49 crc kubenswrapper[4852]: E1210 12:38:49.610361 4852 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767\": container with ID starting with d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767 not found: ID does not exist" containerID="d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767" Dec 10 12:38:49 crc kubenswrapper[4852]: I1210 12:38:49.610397 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767"} err="failed to get container status \"d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767\": rpc error: code = NotFound desc = could not find container \"d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767\": container with ID starting with d6103672bf5669e2126efa71adb425c5930694b60f8167eb974a21cf963d2767 not found: ID does not exist" Dec 10 12:38:50 crc kubenswrapper[4852]: I1210 12:38:50.183529 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" path="/var/lib/kubelet/pods/5ec9d108-5532-402a-89fe-3baa753e38f5/volumes" Dec 10 12:39:14 crc kubenswrapper[4852]: I1210 12:39:14.782757 4852 generic.go:334] "Generic (PLEG): container finished" podID="4d0aea88-1cca-4e75-bc26-15c9f44d8682" containerID="ade6fb426f686daa073a2f7a69d61db6387d2c9c9d9994965464463c8e21d133" exitCode=0 Dec 10 12:39:14 crc kubenswrapper[4852]: I1210 12:39:14.783200 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" event={"ID":"4d0aea88-1cca-4e75-bc26-15c9f44d8682","Type":"ContainerDied","Data":"ade6fb426f686daa073a2f7a69d61db6387d2c9c9d9994965464463c8e21d133"} Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.228587 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350395 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-0\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350514 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r64dl\" (UniqueName: \"kubernetes.io/projected/4d0aea88-1cca-4e75-bc26-15c9f44d8682-kube-api-access-r64dl\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350579 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-combined-ca-bundle\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350633 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-ssh-key\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350688 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-1\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350725 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-extra-config-0\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350801 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-1\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350840 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-inventory\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.350913 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-0\") pod \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\" (UID: \"4d0aea88-1cca-4e75-bc26-15c9f44d8682\") " Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.358688 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4d0aea88-1cca-4e75-bc26-15c9f44d8682-kube-api-access-r64dl" (OuterVolumeSpecName: "kube-api-access-r64dl") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "kube-api-access-r64dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.383441 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.383536 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.386461 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.391784 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-inventory" (OuterVolumeSpecName: "inventory") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.392640 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.394021 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.400849 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.401799 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d0aea88-1cca-4e75-bc26-15c9f44d8682" (UID: "4d0aea88-1cca-4e75-bc26-15c9f44d8682"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452903 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r64dl\" (UniqueName: \"kubernetes.io/projected/4d0aea88-1cca-4e75-bc26-15c9f44d8682-kube-api-access-r64dl\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452940 4852 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452949 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452959 4852 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452970 4852 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452978 4852 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452986 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.452996 4852 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.453004 4852 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4d0aea88-1cca-4e75-bc26-15c9f44d8682-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.804533 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh" event={"ID":"4d0aea88-1cca-4e75-bc26-15c9f44d8682","Type":"ContainerDied","Data":"fa42a224a49085fc6a3d826d8da6a2f320819827c93bcb33b62f6b0950e4f04b"} Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.804581 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa42a224a49085fc6a3d826d8da6a2f320819827c93bcb33b62f6b0950e4f04b" Dec 10 12:39:16 crc 
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.804646 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-bjrzh"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.912657 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"]
Dec 10 12:39:16 crc kubenswrapper[4852]: E1210 12:39:16.913089 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0aea88-1cca-4e75-bc26-15c9f44d8682" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.913114 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0aea88-1cca-4e75-bc26-15c9f44d8682" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:39:16 crc kubenswrapper[4852]: E1210 12:39:16.913147 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="registry-server"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.913158 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="registry-server"
Dec 10 12:39:16 crc kubenswrapper[4852]: E1210 12:39:16.913177 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="extract-utilities"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.913185 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="extract-utilities"
Dec 10 12:39:16 crc kubenswrapper[4852]: E1210 12:39:16.913203 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="extract-content"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.913210 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="extract-content"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.913434 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec9d108-5532-402a-89fe-3baa753e38f5" containerName="registry-server"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.913468 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0aea88-1cca-4e75-bc26-15c9f44d8682" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.914265 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.916107 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.916194 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-svh8h" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.917427 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.919497 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.920271 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 10 12:39:16 crc kubenswrapper[4852]: I1210 12:39:16.923486 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"] Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.066956 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.067027 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.067056 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6292t\" (UniqueName: \"kubernetes.io/projected/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-kube-api-access-6292t\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.067079 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.067115 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc 
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.067166 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.067209 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.169370 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.169472 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.169508 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6292t\" (UniqueName: \"kubernetes.io/projected/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-kube-api-access-6292t\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.169557 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.171019 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.171168 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.171337 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.176413 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.176663 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.176855 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.177317 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.177747 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.178874 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.195680 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6292t\" (UniqueName: \"kubernetes.io/projected/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-kube-api-access-6292t\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb\" (UID: 
\"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.232079 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.803637 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb"] Dec 10 12:39:17 crc kubenswrapper[4852]: I1210 12:39:17.815679 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" event={"ID":"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2","Type":"ContainerStarted","Data":"a9a213e50c2706b3bdff69e732be36263de78a3c03d74d544ea31d0d374e2e6c"} Dec 10 12:39:19 crc kubenswrapper[4852]: I1210 12:39:19.838622 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" event={"ID":"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2","Type":"ContainerStarted","Data":"6d22eb7c7ca5f437bc4a1c5a32199bfa7cf4eb3932fc89b3b0071d5fff3c69cb"} Dec 10 12:39:19 crc kubenswrapper[4852]: I1210 12:39:19.860349 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" podStartSLOduration=2.251898922 podStartE2EDuration="3.860330568s" podCreationTimestamp="2025-12-10 12:39:16 +0000 UTC" firstStartedPulling="2025-12-10 12:39:17.802612452 +0000 UTC m=+2843.888137716" lastFinishedPulling="2025-12-10 12:39:19.411044138 +0000 UTC m=+2845.496569362" observedRunningTime="2025-12-10 12:39:19.857510167 +0000 UTC m=+2845.943035411" watchObservedRunningTime="2025-12-10 12:39:19.860330568 +0000 UTC m=+2845.945855802" Dec 10 12:39:45 crc kubenswrapper[4852]: I1210 12:39:45.790157 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:39:45 crc kubenswrapper[4852]: I1210 12:39:45.791521 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.383015 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wqzpw"] Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.390407 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.404559 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqzpw"] Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.494883 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wcj\" (UniqueName: \"kubernetes.io/projected/28aae2f0-7138-44e0-8374-a33ffaf53e67-kube-api-access-m7wcj\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.494954 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-catalog-content\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.495039 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-utilities\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.597037 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wcj\" (UniqueName: \"kubernetes.io/projected/28aae2f0-7138-44e0-8374-a33ffaf53e67-kube-api-access-m7wcj\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.597111 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-catalog-content\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.597206 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-utilities\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.597853 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-utilities\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.598282 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-catalog-content\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.620331 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m7wcj\" (UniqueName: \"kubernetes.io/projected/28aae2f0-7138-44e0-8374-a33ffaf53e67-kube-api-access-m7wcj\") pod \"community-operators-wqzpw\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:48 crc kubenswrapper[4852]: I1210 12:39:48.722056 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:49 crc kubenswrapper[4852]: I1210 12:39:49.119849 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqzpw"] Dec 10 12:39:49 crc kubenswrapper[4852]: W1210 12:39:49.121528 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28aae2f0_7138_44e0_8374_a33ffaf53e67.slice/crio-e7032fabfdc5b03de3a76039da605a3c7258fb60a3d1967b1040bd1360e46bf8 WatchSource:0}: Error finding container e7032fabfdc5b03de3a76039da605a3c7258fb60a3d1967b1040bd1360e46bf8: Status 404 returned error can't find the container with id e7032fabfdc5b03de3a76039da605a3c7258fb60a3d1967b1040bd1360e46bf8 Dec 10 12:39:50 crc kubenswrapper[4852]: I1210 12:39:50.138057 4852 generic.go:334] "Generic (PLEG): container finished" podID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerID="e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6" exitCode=0 Dec 10 12:39:50 crc kubenswrapper[4852]: I1210 12:39:50.138159 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqzpw" event={"ID":"28aae2f0-7138-44e0-8374-a33ffaf53e67","Type":"ContainerDied","Data":"e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6"} Dec 10 12:39:50 crc kubenswrapper[4852]: I1210 12:39:50.139321 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqzpw" event={"ID":"28aae2f0-7138-44e0-8374-a33ffaf53e67","Type":"ContainerStarted","Data":"e7032fabfdc5b03de3a76039da605a3c7258fb60a3d1967b1040bd1360e46bf8"} Dec 10 12:39:52 crc kubenswrapper[4852]: I1210 12:39:52.166681 4852 generic.go:334] "Generic (PLEG): container finished" podID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerID="c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009" exitCode=0 Dec 10 12:39:52 crc kubenswrapper[4852]: I1210 12:39:52.167219 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqzpw" event={"ID":"28aae2f0-7138-44e0-8374-a33ffaf53e67","Type":"ContainerDied","Data":"c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009"} Dec 10 12:39:54 crc kubenswrapper[4852]: I1210 12:39:54.201707 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqzpw" event={"ID":"28aae2f0-7138-44e0-8374-a33ffaf53e67","Type":"ContainerStarted","Data":"e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a"} Dec 10 12:39:55 crc kubenswrapper[4852]: I1210 12:39:55.229774 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wqzpw" podStartSLOduration=4.073186414 podStartE2EDuration="7.229758505s" podCreationTimestamp="2025-12-10 12:39:48 +0000 UTC" firstStartedPulling="2025-12-10 12:39:50.140894999 +0000 UTC m=+2876.226420263" lastFinishedPulling="2025-12-10 12:39:53.29746713 +0000 UTC m=+2879.382992354" observedRunningTime="2025-12-10 12:39:55.228092964 +0000 UTC 
m=+2881.313618188" watchObservedRunningTime="2025-12-10 12:39:55.229758505 +0000 UTC m=+2881.315283729" Dec 10 12:39:58 crc kubenswrapper[4852]: I1210 12:39:58.722314 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:58 crc kubenswrapper[4852]: I1210 12:39:58.722904 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:58 crc kubenswrapper[4852]: I1210 12:39:58.786041 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:59 crc kubenswrapper[4852]: I1210 12:39:59.305701 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:39:59 crc kubenswrapper[4852]: I1210 12:39:59.356280 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqzpw"] Dec 10 12:40:01 crc kubenswrapper[4852]: I1210 12:40:01.294635 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wqzpw" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="registry-server" containerID="cri-o://e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a" gracePeriod=2 Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.220000 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.300015 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-catalog-content\") pod \"28aae2f0-7138-44e0-8374-a33ffaf53e67\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.301314 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7wcj\" (UniqueName: \"kubernetes.io/projected/28aae2f0-7138-44e0-8374-a33ffaf53e67-kube-api-access-m7wcj\") pod \"28aae2f0-7138-44e0-8374-a33ffaf53e67\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.301493 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-utilities\") pod \"28aae2f0-7138-44e0-8374-a33ffaf53e67\" (UID: \"28aae2f0-7138-44e0-8374-a33ffaf53e67\") " Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.303129 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-utilities" (OuterVolumeSpecName: "utilities") pod "28aae2f0-7138-44e0-8374-a33ffaf53e67" (UID: "28aae2f0-7138-44e0-8374-a33ffaf53e67"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.303900 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.306536 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28aae2f0-7138-44e0-8374-a33ffaf53e67-kube-api-access-m7wcj" (OuterVolumeSpecName: "kube-api-access-m7wcj") pod "28aae2f0-7138-44e0-8374-a33ffaf53e67" (UID: "28aae2f0-7138-44e0-8374-a33ffaf53e67"). InnerVolumeSpecName "kube-api-access-m7wcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.310935 4852 generic.go:334] "Generic (PLEG): container finished" podID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerID="e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a" exitCode=0 Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.310976 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqzpw" event={"ID":"28aae2f0-7138-44e0-8374-a33ffaf53e67","Type":"ContainerDied","Data":"e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a"} Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.311008 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqzpw" event={"ID":"28aae2f0-7138-44e0-8374-a33ffaf53e67","Type":"ContainerDied","Data":"e7032fabfdc5b03de3a76039da605a3c7258fb60a3d1967b1040bd1360e46bf8"} Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.311024 4852 scope.go:117] "RemoveContainer" containerID="e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.311168 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqzpw" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.367641 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28aae2f0-7138-44e0-8374-a33ffaf53e67" (UID: "28aae2f0-7138-44e0-8374-a33ffaf53e67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.375322 4852 scope.go:117] "RemoveContainer" containerID="c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.397731 4852 scope.go:117] "RemoveContainer" containerID="e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.405784 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28aae2f0-7138-44e0-8374-a33ffaf53e67-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.405821 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7wcj\" (UniqueName: \"kubernetes.io/projected/28aae2f0-7138-44e0-8374-a33ffaf53e67-kube-api-access-m7wcj\") on node \"crc\" DevicePath \"\"" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.439694 4852 scope.go:117] "RemoveContainer" containerID="e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a" Dec 10 12:40:02 crc kubenswrapper[4852]: E1210 12:40:02.440154 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a\": container with ID starting with e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a not found: ID does not exist" containerID="e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.440183 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a"} err="failed to get container status \"e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a\": rpc error: code = NotFound desc = could not find container \"e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a\": container with ID starting with e3677eed24b13a90b3454bba8d028381da3b0112c8241250c2a061c82fd8ed7a not found: ID does not exist" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.440205 4852 scope.go:117] "RemoveContainer" containerID="c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009" Dec 10 12:40:02 crc kubenswrapper[4852]: E1210 12:40:02.440573 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009\": container with ID starting with c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009 not found: ID does not exist" containerID="c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.440593 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009"} err="failed to get container status \"c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009\": rpc error: code = NotFound desc = could not find container \"c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009\": container with ID starting with c0e0bc134374c61e9411fc05cf77871b0e854a774f64f166a47e757211b1b009 not found: ID does not exist" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.440607 4852 scope.go:117] "RemoveContainer" 
containerID="e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6" Dec 10 12:40:02 crc kubenswrapper[4852]: E1210 12:40:02.440808 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6\": container with ID starting with e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6 not found: ID does not exist" containerID="e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.440832 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6"} err="failed to get container status \"e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6\": rpc error: code = NotFound desc = could not find container \"e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6\": container with ID starting with e46960c50b42c598aa5fbb6cdca83b5cb9a4855a9a30d586e4ab9fe89678cbb6 not found: ID does not exist" Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.651942 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqzpw"] Dec 10 12:40:02 crc kubenswrapper[4852]: I1210 12:40:02.659092 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wqzpw"] Dec 10 12:40:04 crc kubenswrapper[4852]: I1210 12:40:04.182992 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" path="/var/lib/kubelet/pods/28aae2f0-7138-44e0-8374-a33ffaf53e67/volumes" Dec 10 12:40:15 crc kubenswrapper[4852]: I1210 12:40:15.790407 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:40:15 crc kubenswrapper[4852]: I1210 12:40:15.790978 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:40:45 crc kubenswrapper[4852]: I1210 12:40:45.790752 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:40:45 crc kubenswrapper[4852]: I1210 12:40:45.791430 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:40:45 crc kubenswrapper[4852]: I1210 12:40:45.791500 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:40:45 crc kubenswrapper[4852]: I1210 12:40:45.792642 4852 kuberuntime_manager.go:1027] "Message for Container 
of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9d8e5eeae465e987e4c229fef97a1d891d93af600c6d4356b1fceb18f67fed4"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:40:45 crc kubenswrapper[4852]: I1210 12:40:45.792735 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://e9d8e5eeae465e987e4c229fef97a1d891d93af600c6d4356b1fceb18f67fed4" gracePeriod=600 Dec 10 12:40:46 crc kubenswrapper[4852]: I1210 12:40:46.774548 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="e9d8e5eeae465e987e4c229fef97a1d891d93af600c6d4356b1fceb18f67fed4" exitCode=0 Dec 10 12:40:46 crc kubenswrapper[4852]: I1210 12:40:46.774637 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"e9d8e5eeae465e987e4c229fef97a1d891d93af600c6d4356b1fceb18f67fed4"} Dec 10 12:40:46 crc kubenswrapper[4852]: I1210 12:40:46.775226 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f"} Dec 10 12:40:46 crc kubenswrapper[4852]: I1210 12:40:46.775279 4852 scope.go:117] "RemoveContainer" containerID="135cbe62465ab769b34317ee2c65ad81e8194ce6fe8696b5a689bfd231169fbd" Dec 10 12:41:43 crc kubenswrapper[4852]: I1210 12:41:43.357197 4852 generic.go:334] "Generic (PLEG): container finished" podID="33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" containerID="6d22eb7c7ca5f437bc4a1c5a32199bfa7cf4eb3932fc89b3b0071d5fff3c69cb" exitCode=0 Dec 10 12:41:43 crc kubenswrapper[4852]: I1210 12:41:43.357278 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" event={"ID":"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2","Type":"ContainerDied","Data":"6d22eb7c7ca5f437bc4a1c5a32199bfa7cf4eb3932fc89b3b0071d5fff3c69cb"} Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.809448 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874094 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-0\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874136 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-telemetry-combined-ca-bundle\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874170 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-1\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874261 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ssh-key\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874279 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-2\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874332 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6292t\" (UniqueName: \"kubernetes.io/projected/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-kube-api-access-6292t\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.874380 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-inventory\") pod \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\" (UID: \"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2\") " Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.879827 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-kube-api-access-6292t" (OuterVolumeSpecName: "kube-api-access-6292t") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). InnerVolumeSpecName "kube-api-access-6292t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.882345 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.901419 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.906896 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-inventory" (OuterVolumeSpecName: "inventory") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.906913 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.908241 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.910703 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" (UID: "33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977011 4852 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977046 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977061 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6292t\" (UniqueName: \"kubernetes.io/projected/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-kube-api-access-6292t\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977070 4852 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-inventory\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977080 4852 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977089 4852 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:44 crc kubenswrapper[4852]: I1210 12:41:44.977100 4852 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 10 12:41:45 crc kubenswrapper[4852]: I1210 12:41:45.380150 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" event={"ID":"33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2","Type":"ContainerDied","Data":"a9a213e50c2706b3bdff69e732be36263de78a3c03d74d544ea31d0d374e2e6c"} Dec 10 12:41:45 crc kubenswrapper[4852]: I1210 12:41:45.380220 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a213e50c2706b3bdff69e732be36263de78a3c03d74d544ea31d0d374e2e6c" Dec 10 12:41:45 crc kubenswrapper[4852]: I1210 12:41:45.380314 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.738797 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 10 12:42:28 crc kubenswrapper[4852]: E1210 12:42:28.739615 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="extract-utilities" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.739627 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="extract-utilities" Dec 10 12:42:28 crc kubenswrapper[4852]: E1210 12:42:28.739640 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="extract-content" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.739647 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="extract-content" Dec 10 12:42:28 crc kubenswrapper[4852]: E1210 12:42:28.739662 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.739670 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 10 12:42:28 crc kubenswrapper[4852]: E1210 12:42:28.739699 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="registry-server" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.739704 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="registry-server" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.739876 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.739891 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="28aae2f0-7138-44e0-8374-a33ffaf53e67" containerName="registry-server" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.740473 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.745172 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.745348 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.745511 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.745614 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-t56dj" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.755556 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.967436 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.967829 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:28 crc kubenswrapper[4852]: I1210 12:42:28.967933 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-config-data\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069328 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069381 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptrn5\" (UniqueName: \"kubernetes.io/projected/9238ddbd-fcdf-4612-974f-114508e02356-kube-api-access-ptrn5\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069493 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069543 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-config-data\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069609 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069775 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069811 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.069992 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.070020 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.072635 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.073880 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-config-data\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.079098 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.171950 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.172416 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.172608 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.172735 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptrn5\" (UniqueName: \"kubernetes.io/projected/9238ddbd-fcdf-4612-974f-114508e02356-kube-api-access-ptrn5\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.172857 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.173007 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.173068 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.173715 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.174276 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.177323 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 
10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.178926 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.204802 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptrn5\" (UniqueName: \"kubernetes.io/projected/9238ddbd-fcdf-4612-974f-114508e02356-kube-api-access-ptrn5\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.209949 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.295009 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.576808 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.583856 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:42:29 crc kubenswrapper[4852]: I1210 12:42:29.984698 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9238ddbd-fcdf-4612-974f-114508e02356","Type":"ContainerStarted","Data":"0b3e7971d7c33f7a34aef85ddb1dc8dc5a5ffd15711c7bd4caae8e47b9522c69"} Dec 10 12:43:11 crc kubenswrapper[4852]: E1210 12:43:11.121069 4852 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 10 12:43:11 crc kubenswrapper[4852]: E1210 12:43:11.121745 4852 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptrn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(9238ddbd-fcdf-4612-974f-114508e02356): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 10 12:43:11 crc kubenswrapper[4852]: E1210 12:43:11.123072 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="9238ddbd-fcdf-4612-974f-114508e02356" Dec 10 12:43:11 crc kubenswrapper[4852]: E1210 12:43:11.390551 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="9238ddbd-fcdf-4612-974f-114508e02356" Dec 10 12:43:15 crc kubenswrapper[4852]: I1210 12:43:15.790451 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:43:15 crc kubenswrapper[4852]: I1210 12:43:15.790961 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:43:25 crc kubenswrapper[4852]: I1210 12:43:25.685841 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 10 12:43:27 crc kubenswrapper[4852]: I1210 12:43:27.533100 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9238ddbd-fcdf-4612-974f-114508e02356","Type":"ContainerStarted","Data":"977fa0b6f0594996e73a8e8838e11a27c80263f27209fde0a338000fef367fd9"} Dec 10 12:43:27 crc kubenswrapper[4852]: I1210 12:43:27.560928 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.461752055 podStartE2EDuration="1m0.560907692s" podCreationTimestamp="2025-12-10 12:42:27 +0000 UTC" firstStartedPulling="2025-12-10 12:42:29.583681499 +0000 UTC m=+3035.669206723" lastFinishedPulling="2025-12-10 12:43:25.682837116 +0000 UTC m=+3091.768362360" observedRunningTime="2025-12-10 12:43:27.552962571 +0000 UTC m=+3093.638487795" watchObservedRunningTime="2025-12-10 12:43:27.560907692 +0000 UTC m=+3093.646432926" Dec 10 12:43:41 crc kubenswrapper[4852]: E1210 12:43:41.014080 4852 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 10 12:43:45 crc kubenswrapper[4852]: I1210 12:43:45.790249 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:43:45 crc kubenswrapper[4852]: I1210 12:43:45.790778 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.790528 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.791103 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.791162 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.791944 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.791990 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" gracePeriod=600 Dec 10 12:44:15 crc kubenswrapper[4852]: E1210 12:44:15.915596 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.984259 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" exitCode=0 Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.984305 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f"} Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.984338 4852 scope.go:117] "RemoveContainer" containerID="e9d8e5eeae465e987e4c229fef97a1d891d93af600c6d4356b1fceb18f67fed4" Dec 10 12:44:15 crc kubenswrapper[4852]: I1210 12:44:15.985455 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:44:15 crc kubenswrapper[4852]: E1210 12:44:15.986032 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:44:30 crc kubenswrapper[4852]: I1210 12:44:30.170938 
4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:44:30 crc kubenswrapper[4852]: E1210 12:44:30.171557 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:44:44 crc kubenswrapper[4852]: I1210 12:44:44.185419 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:44:44 crc kubenswrapper[4852]: E1210 12:44:44.186384 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:44:58 crc kubenswrapper[4852]: I1210 12:44:58.170258 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:44:58 crc kubenswrapper[4852]: E1210 12:44:58.171043 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.151047 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm"] Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.152736 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.156320 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.156688 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.165135 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm"] Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.318208 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a72a146-cf73-44ad-b022-95978ec07da7-secret-volume\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.318308 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a72a146-cf73-44ad-b022-95978ec07da7-config-volume\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.318431 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbz4j\" (UniqueName: \"kubernetes.io/projected/8a72a146-cf73-44ad-b022-95978ec07da7-kube-api-access-jbz4j\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.419947 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbz4j\" (UniqueName: \"kubernetes.io/projected/8a72a146-cf73-44ad-b022-95978ec07da7-kube-api-access-jbz4j\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.420101 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a72a146-cf73-44ad-b022-95978ec07da7-secret-volume\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.421008 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a72a146-cf73-44ad-b022-95978ec07da7-config-volume\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.421777 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a72a146-cf73-44ad-b022-95978ec07da7-config-volume\") pod 
\"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.430265 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a72a146-cf73-44ad-b022-95978ec07da7-secret-volume\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.438186 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbz4j\" (UniqueName: \"kubernetes.io/projected/8a72a146-cf73-44ad-b022-95978ec07da7-kube-api-access-jbz4j\") pod \"collect-profiles-29422845-bk9fm\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.486137 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:00 crc kubenswrapper[4852]: I1210 12:45:00.966359 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm"] Dec 10 12:45:01 crc kubenswrapper[4852]: I1210 12:45:01.390993 4852 generic.go:334] "Generic (PLEG): container finished" podID="8a72a146-cf73-44ad-b022-95978ec07da7" containerID="c23bde359ed2ce7ed9f5b528813f2000c52e9cce783f3938b3b080aa5eaff06f" exitCode=0 Dec 10 12:45:01 crc kubenswrapper[4852]: I1210 12:45:01.391157 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" event={"ID":"8a72a146-cf73-44ad-b022-95978ec07da7","Type":"ContainerDied","Data":"c23bde359ed2ce7ed9f5b528813f2000c52e9cce783f3938b3b080aa5eaff06f"} Dec 10 12:45:01 crc kubenswrapper[4852]: I1210 12:45:01.392011 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" event={"ID":"8a72a146-cf73-44ad-b022-95978ec07da7","Type":"ContainerStarted","Data":"34bb856464c38ec6c134cad509a64074dd02bb696f3decd4e51b205ae10f33af"} Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.774353 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.867425 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbz4j\" (UniqueName: \"kubernetes.io/projected/8a72a146-cf73-44ad-b022-95978ec07da7-kube-api-access-jbz4j\") pod \"8a72a146-cf73-44ad-b022-95978ec07da7\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.867617 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a72a146-cf73-44ad-b022-95978ec07da7-secret-volume\") pod \"8a72a146-cf73-44ad-b022-95978ec07da7\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.867845 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a72a146-cf73-44ad-b022-95978ec07da7-config-volume\") pod \"8a72a146-cf73-44ad-b022-95978ec07da7\" (UID: \"8a72a146-cf73-44ad-b022-95978ec07da7\") " Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.869098 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a72a146-cf73-44ad-b022-95978ec07da7-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a72a146-cf73-44ad-b022-95978ec07da7" (UID: "8a72a146-cf73-44ad-b022-95978ec07da7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.875384 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a72a146-cf73-44ad-b022-95978ec07da7-kube-api-access-jbz4j" (OuterVolumeSpecName: "kube-api-access-jbz4j") pod "8a72a146-cf73-44ad-b022-95978ec07da7" (UID: "8a72a146-cf73-44ad-b022-95978ec07da7"). InnerVolumeSpecName "kube-api-access-jbz4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.877414 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a72a146-cf73-44ad-b022-95978ec07da7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a72a146-cf73-44ad-b022-95978ec07da7" (UID: "8a72a146-cf73-44ad-b022-95978ec07da7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.969542 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbz4j\" (UniqueName: \"kubernetes.io/projected/8a72a146-cf73-44ad-b022-95978ec07da7-kube-api-access-jbz4j\") on node \"crc\" DevicePath \"\"" Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.969574 4852 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a72a146-cf73-44ad-b022-95978ec07da7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 10 12:45:02 crc kubenswrapper[4852]: I1210 12:45:02.969585 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a72a146-cf73-44ad-b022-95978ec07da7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 10 12:45:03 crc kubenswrapper[4852]: I1210 12:45:03.410373 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" event={"ID":"8a72a146-cf73-44ad-b022-95978ec07da7","Type":"ContainerDied","Data":"34bb856464c38ec6c134cad509a64074dd02bb696f3decd4e51b205ae10f33af"} Dec 10 12:45:03 crc kubenswrapper[4852]: I1210 12:45:03.410659 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34bb856464c38ec6c134cad509a64074dd02bb696f3decd4e51b205ae10f33af" Dec 10 12:45:03 crc kubenswrapper[4852]: I1210 12:45:03.410416 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422845-bk9fm" Dec 10 12:45:03 crc kubenswrapper[4852]: I1210 12:45:03.854120 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"] Dec 10 12:45:03 crc kubenswrapper[4852]: I1210 12:45:03.864132 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422800-w87qd"] Dec 10 12:45:04 crc kubenswrapper[4852]: I1210 12:45:04.272492 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3203c694-b4b0-43d8-a728-8a0804346e1c" path="/var/lib/kubelet/pods/3203c694-b4b0-43d8-a728-8a0804346e1c/volumes" Dec 10 12:45:06 crc kubenswrapper[4852]: I1210 12:45:06.243015 4852 scope.go:117] "RemoveContainer" containerID="2f4e481241117b39520e59cda5b229c4110bab117172284f52ea06f0aadd3a7c" Dec 10 12:45:10 crc kubenswrapper[4852]: I1210 12:45:10.173277 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:45:10 crc kubenswrapper[4852]: E1210 12:45:10.174984 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:45:25 crc kubenswrapper[4852]: I1210 12:45:25.169763 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:45:25 crc kubenswrapper[4852]: E1210 12:45:25.173023 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:45:36 crc kubenswrapper[4852]: I1210 12:45:36.170928 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:45:36 crc kubenswrapper[4852]: E1210 12:45:36.172022 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:45:51 crc kubenswrapper[4852]: I1210 12:45:51.170353 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:45:51 crc kubenswrapper[4852]: E1210 12:45:51.171105 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:46:02 crc kubenswrapper[4852]: I1210 12:46:02.169752 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:46:02 crc kubenswrapper[4852]: E1210 12:46:02.170628 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:46:17 crc kubenswrapper[4852]: I1210 12:46:17.170990 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:46:17 crc kubenswrapper[4852]: E1210 12:46:17.172778 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:46:31 crc kubenswrapper[4852]: I1210 12:46:31.170507 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:46:31 crc kubenswrapper[4852]: E1210 12:46:31.171326 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
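
The repeating RemoveContainer / "Error syncing pod" pairs above are the kubelet sync loop re-attempting the machine-config-daemon container while the pod worker refuses to start it: the crash-loop back-off window is still open, and each attempt is rejected with the "back-off 5m0s" message. The kubelet doubles the restart delay after each failed run until it reaches the 5m cap reported here; the container finally starts again at 12:49:21, once the window expires. A minimal sketch of that schedule follows; the 10s initial delay and doubling factor are the upstream kubelet defaults, assumed for illustration rather than read from this node's configuration:

# Sketch of kubelet-style crash-loop back-off: the delay doubles after each
# failed restart and is clamped at the cap ("back-off 5m0s" in the log above).
# initial=10s and factor=2 are assumed upstream defaults, not node config.
def crashloop_delays(restarts, initial=10.0, cap=300.0, factor=2.0):
    delay = initial
    for _ in range(restarts):
        yield min(delay, cap)
        delay *= factor

print(list(crashloop_delays(8)))
# [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0, 300.0]
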
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:46:46 crc kubenswrapper[4852]: I1210 12:46:46.171565 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:46:46 crc kubenswrapper[4852]: E1210 12:46:46.172392 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:47:01 crc kubenswrapper[4852]: I1210 12:47:01.171059 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:47:01 crc kubenswrapper[4852]: E1210 12:47:01.171637 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:47:12 crc kubenswrapper[4852]: I1210 12:47:12.174517 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:47:12 crc kubenswrapper[4852]: E1210 12:47:12.175270 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:47:27 crc kubenswrapper[4852]: I1210 12:47:27.170640 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:47:27 crc kubenswrapper[4852]: E1210 12:47:27.171765 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:47:39 crc kubenswrapper[4852]: I1210 12:47:39.169524 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:47:39 crc kubenswrapper[4852]: E1210 12:47:39.170344 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.170267 4852 
scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:47:50 crc kubenswrapper[4852]: E1210 12:47:50.172310 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.807073 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mzdcc"] Dec 10 12:47:50 crc kubenswrapper[4852]: E1210 12:47:50.807506 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a72a146-cf73-44ad-b022-95978ec07da7" containerName="collect-profiles" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.807523 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a72a146-cf73-44ad-b022-95978ec07da7" containerName="collect-profiles" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.807700 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a72a146-cf73-44ad-b022-95978ec07da7" containerName="collect-profiles" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.809169 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.823129 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mzdcc"] Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.876596 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-catalog-content\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.876698 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbkr\" (UniqueName: \"kubernetes.io/projected/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-kube-api-access-5zbkr\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.876786 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-utilities\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.979806 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-catalog-content\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.979933 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbkr\" (UniqueName: 
\"kubernetes.io/projected/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-kube-api-access-5zbkr\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.980001 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-utilities\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.980898 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-utilities\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:50 crc kubenswrapper[4852]: I1210 12:47:50.981273 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-catalog-content\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.028116 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbkr\" (UniqueName: \"kubernetes.io/projected/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-kube-api-access-5zbkr\") pod \"redhat-operators-mzdcc\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.133514 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.641308 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mzdcc"] Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.974729 4852 generic.go:334] "Generic (PLEG): container finished" podID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerID="377cdcbd6a8cc449c2ccbadde6e58064734252017404e6802fe06a0e29a3fedc" exitCode=0 Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.974818 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerDied","Data":"377cdcbd6a8cc449c2ccbadde6e58064734252017404e6802fe06a0e29a3fedc"} Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.975432 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerStarted","Data":"6337c68af32a3bbeffb0d37b79daf435177dd43f5b60db5bc0a79563bc9baafc"} Dec 10 12:47:51 crc kubenswrapper[4852]: I1210 12:47:51.977222 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:47:52 crc kubenswrapper[4852]: I1210 12:47:52.986392 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerStarted","Data":"971ce918c26657b5156519d08948d2a8e7bab7634f56471ba4d08c6be7c198ff"} Dec 10 12:47:53 crc kubenswrapper[4852]: I1210 12:47:53.997468 4852 generic.go:334] "Generic (PLEG): container finished" podID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerID="971ce918c26657b5156519d08948d2a8e7bab7634f56471ba4d08c6be7c198ff" exitCode=0 Dec 10 12:47:53 crc kubenswrapper[4852]: I1210 12:47:53.997520 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerDied","Data":"971ce918c26657b5156519d08948d2a8e7bab7634f56471ba4d08c6be7c198ff"} Dec 10 12:47:55 crc kubenswrapper[4852]: I1210 12:47:55.006902 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerStarted","Data":"d2c38abdea14b4a7256094630ffa526970723f4f4ec1424d34cb1c6031e396ab"} Dec 10 12:47:55 crc kubenswrapper[4852]: I1210 12:47:55.031040 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mzdcc" podStartSLOduration=2.393240841 podStartE2EDuration="5.03102093s" podCreationTimestamp="2025-12-10 12:47:50 +0000 UTC" firstStartedPulling="2025-12-10 12:47:51.976978236 +0000 UTC m=+3358.062503460" lastFinishedPulling="2025-12-10 12:47:54.614758325 +0000 UTC m=+3360.700283549" observedRunningTime="2025-12-10 12:47:55.025774969 +0000 UTC m=+3361.111300203" watchObservedRunningTime="2025-12-10 12:47:55.03102093 +0000 UTC m=+3361.116546154" Dec 10 12:48:01 crc kubenswrapper[4852]: I1210 12:48:01.134479 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:48:01 crc kubenswrapper[4852]: I1210 12:48:01.135103 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:48:01 crc 
kubenswrapper[4852]: I1210 12:48:01.185084 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:48:02 crc kubenswrapper[4852]: I1210 12:48:02.137377 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:48:02 crc kubenswrapper[4852]: I1210 12:48:02.189257 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mzdcc"] Dec 10 12:48:04 crc kubenswrapper[4852]: I1210 12:48:04.095014 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mzdcc" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="registry-server" containerID="cri-o://d2c38abdea14b4a7256094630ffa526970723f4f4ec1424d34cb1c6031e396ab" gracePeriod=2 Dec 10 12:48:04 crc kubenswrapper[4852]: I1210 12:48:04.179397 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:48:04 crc kubenswrapper[4852]: E1210 12:48:04.180125 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.116545 4852 generic.go:334] "Generic (PLEG): container finished" podID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerID="d2c38abdea14b4a7256094630ffa526970723f4f4ec1424d34cb1c6031e396ab" exitCode=0 Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.116939 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerDied","Data":"d2c38abdea14b4a7256094630ffa526970723f4f4ec1424d34cb1c6031e396ab"} Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.535668 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.677955 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zbkr\" (UniqueName: \"kubernetes.io/projected/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-kube-api-access-5zbkr\") pod \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.678121 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-utilities\") pod \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.678167 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-catalog-content\") pod \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\" (UID: \"83c4f228-7494-47b3-b0f7-7d8c5c0edc63\") " Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.678996 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-utilities" (OuterVolumeSpecName: "utilities") pod "83c4f228-7494-47b3-b0f7-7d8c5c0edc63" (UID: "83c4f228-7494-47b3-b0f7-7d8c5c0edc63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.684654 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-kube-api-access-5zbkr" (OuterVolumeSpecName: "kube-api-access-5zbkr") pod "83c4f228-7494-47b3-b0f7-7d8c5c0edc63" (UID: "83c4f228-7494-47b3-b0f7-7d8c5c0edc63"). InnerVolumeSpecName "kube-api-access-5zbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.781144 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.781179 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zbkr\" (UniqueName: \"kubernetes.io/projected/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-kube-api-access-5zbkr\") on node \"crc\" DevicePath \"\"" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.788950 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83c4f228-7494-47b3-b0f7-7d8c5c0edc63" (UID: "83c4f228-7494-47b3-b0f7-7d8c5c0edc63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:48:06 crc kubenswrapper[4852]: I1210 12:48:06.882929 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c4f228-7494-47b3-b0f7-7d8c5c0edc63-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.130569 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzdcc" event={"ID":"83c4f228-7494-47b3-b0f7-7d8c5c0edc63","Type":"ContainerDied","Data":"6337c68af32a3bbeffb0d37b79daf435177dd43f5b60db5bc0a79563bc9baafc"} Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.130626 4852 scope.go:117] "RemoveContainer" containerID="d2c38abdea14b4a7256094630ffa526970723f4f4ec1424d34cb1c6031e396ab" Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.130680 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzdcc" Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.152294 4852 scope.go:117] "RemoveContainer" containerID="971ce918c26657b5156519d08948d2a8e7bab7634f56471ba4d08c6be7c198ff" Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.185065 4852 scope.go:117] "RemoveContainer" containerID="377cdcbd6a8cc449c2ccbadde6e58064734252017404e6802fe06a0e29a3fedc" Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.196978 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mzdcc"] Dec 10 12:48:07 crc kubenswrapper[4852]: I1210 12:48:07.238541 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mzdcc"] Dec 10 12:48:08 crc kubenswrapper[4852]: I1210 12:48:08.183944 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" path="/var/lib/kubelet/pods/83c4f228-7494-47b3-b0f7-7d8c5c0edc63/volumes" Dec 10 12:48:16 crc kubenswrapper[4852]: I1210 12:48:16.170706 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:48:16 crc kubenswrapper[4852]: E1210 12:48:16.171522 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:48:29 crc kubenswrapper[4852]: I1210 12:48:29.170574 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:48:29 crc kubenswrapper[4852]: E1210 12:48:29.171571 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.172773 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vltkv"] Dec 10 12:48:39 crc kubenswrapper[4852]: E1210 12:48:39.174054 4852 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="registry-server" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.174072 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="registry-server" Dec 10 12:48:39 crc kubenswrapper[4852]: E1210 12:48:39.174099 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="extract-utilities" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.174106 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="extract-utilities" Dec 10 12:48:39 crc kubenswrapper[4852]: E1210 12:48:39.174121 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="extract-content" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.174128 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="extract-content" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.174362 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c4f228-7494-47b3-b0f7-7d8c5c0edc63" containerName="registry-server" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.176005 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.193195 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vltkv"] Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.310290 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzw9\" (UniqueName: \"kubernetes.io/projected/5e3c1e36-6ba2-4297-9488-8e0b59774844-kube-api-access-gvzw9\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.310404 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-utilities\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.310455 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-catalog-content\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.412646 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzw9\" (UniqueName: \"kubernetes.io/projected/5e3c1e36-6ba2-4297-9488-8e0b59774844-kube-api-access-gvzw9\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.412736 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-utilities\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.412773 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-catalog-content\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.413373 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-utilities\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.413533 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-catalog-content\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.432461 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzw9\" (UniqueName: \"kubernetes.io/projected/5e3c1e36-6ba2-4297-9488-8e0b59774844-kube-api-access-gvzw9\") pod \"redhat-marketplace-vltkv\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.504099 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:39 crc kubenswrapper[4852]: I1210 12:48:39.945213 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vltkv"] Dec 10 12:48:40 crc kubenswrapper[4852]: I1210 12:48:40.431352 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerStarted","Data":"9ac399a8c284358732b6b9509f5dc562214272f010f56c57ddada26dfd49e9f5"} Dec 10 12:48:41 crc kubenswrapper[4852]: I1210 12:48:41.452984 4852 generic.go:334] "Generic (PLEG): container finished" podID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerID="5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8" exitCode=0 Dec 10 12:48:41 crc kubenswrapper[4852]: I1210 12:48:41.453101 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerDied","Data":"5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8"} Dec 10 12:48:42 crc kubenswrapper[4852]: I1210 12:48:42.462266 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerStarted","Data":"5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8"} Dec 10 12:48:43 crc kubenswrapper[4852]: I1210 12:48:43.477705 4852 generic.go:334] "Generic (PLEG): container finished" podID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerID="5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8" exitCode=0 Dec 10 12:48:43 crc kubenswrapper[4852]: I1210 12:48:43.478067 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerDied","Data":"5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8"} Dec 10 12:48:44 crc kubenswrapper[4852]: I1210 12:48:44.176764 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:48:44 crc kubenswrapper[4852]: E1210 12:48:44.177124 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:48:45 crc kubenswrapper[4852]: I1210 12:48:45.514897 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerStarted","Data":"2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef"} Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.542051 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vltkv" podStartSLOduration=5.790226612 podStartE2EDuration="8.542032367s" podCreationTimestamp="2025-12-10 12:48:39 +0000 UTC" firstStartedPulling="2025-12-10 12:48:41.456403311 +0000 UTC m=+3407.541928535" lastFinishedPulling="2025-12-10 12:48:44.208209066 +0000 UTC m=+3410.293734290" observedRunningTime="2025-12-10 
12:48:45.550343288 +0000 UTC m=+3411.635868522" watchObservedRunningTime="2025-12-10 12:48:47.542032367 +0000 UTC m=+3413.627557591" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.548331 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hsj9"] Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.550470 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.562732 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hsj9"] Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.587952 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-catalog-content\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.588662 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdt85\" (UniqueName: \"kubernetes.io/projected/84b72172-7c67-47a3-bba5-23f2f158a602-kube-api-access-mdt85\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.588762 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-utilities\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.690774 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdt85\" (UniqueName: \"kubernetes.io/projected/84b72172-7c67-47a3-bba5-23f2f158a602-kube-api-access-mdt85\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.690842 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-utilities\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.690902 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-catalog-content\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.691702 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-utilities\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.691756 4852 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-catalog-content\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.719432 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdt85\" (UniqueName: \"kubernetes.io/projected/84b72172-7c67-47a3-bba5-23f2f158a602-kube-api-access-mdt85\") pod \"certified-operators-2hsj9\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:47 crc kubenswrapper[4852]: I1210 12:48:47.876978 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:48 crc kubenswrapper[4852]: I1210 12:48:48.346036 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hsj9"] Dec 10 12:48:48 crc kubenswrapper[4852]: I1210 12:48:48.540870 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hsj9" event={"ID":"84b72172-7c67-47a3-bba5-23f2f158a602","Type":"ContainerStarted","Data":"203aaab02141512508265e91b9e9110946219aa5dcc9f970d833fdc8e29c70d7"} Dec 10 12:48:49 crc kubenswrapper[4852]: I1210 12:48:49.504901 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:49 crc kubenswrapper[4852]: I1210 12:48:49.505386 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:49 crc kubenswrapper[4852]: I1210 12:48:49.550851 4852 generic.go:334] "Generic (PLEG): container finished" podID="84b72172-7c67-47a3-bba5-23f2f158a602" containerID="3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878" exitCode=0 Dec 10 12:48:49 crc kubenswrapper[4852]: I1210 12:48:49.550896 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hsj9" event={"ID":"84b72172-7c67-47a3-bba5-23f2f158a602","Type":"ContainerDied","Data":"3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878"} Dec 10 12:48:49 crc kubenswrapper[4852]: I1210 12:48:49.581547 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:49 crc kubenswrapper[4852]: I1210 12:48:49.632356 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:51 crc kubenswrapper[4852]: I1210 12:48:51.586512 4852 generic.go:334] "Generic (PLEG): container finished" podID="84b72172-7c67-47a3-bba5-23f2f158a602" containerID="f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9" exitCode=0 Dec 10 12:48:51 crc kubenswrapper[4852]: I1210 12:48:51.586558 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hsj9" event={"ID":"84b72172-7c67-47a3-bba5-23f2f158a602","Type":"ContainerDied","Data":"f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9"} Dec 10 12:48:51 crc kubenswrapper[4852]: I1210 12:48:51.941070 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vltkv"] Dec 10 12:48:51 crc kubenswrapper[4852]: I1210 
12:48:51.941323 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vltkv" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="registry-server" containerID="cri-o://2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef" gracePeriod=2 Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.485673 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.586109 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvzw9\" (UniqueName: \"kubernetes.io/projected/5e3c1e36-6ba2-4297-9488-8e0b59774844-kube-api-access-gvzw9\") pod \"5e3c1e36-6ba2-4297-9488-8e0b59774844\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.586470 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-utilities\") pod \"5e3c1e36-6ba2-4297-9488-8e0b59774844\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.586584 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-catalog-content\") pod \"5e3c1e36-6ba2-4297-9488-8e0b59774844\" (UID: \"5e3c1e36-6ba2-4297-9488-8e0b59774844\") " Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.587494 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-utilities" (OuterVolumeSpecName: "utilities") pod "5e3c1e36-6ba2-4297-9488-8e0b59774844" (UID: "5e3c1e36-6ba2-4297-9488-8e0b59774844"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.597874 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3c1e36-6ba2-4297-9488-8e0b59774844-kube-api-access-gvzw9" (OuterVolumeSpecName: "kube-api-access-gvzw9") pod "5e3c1e36-6ba2-4297-9488-8e0b59774844" (UID: "5e3c1e36-6ba2-4297-9488-8e0b59774844"). InnerVolumeSpecName "kube-api-access-gvzw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.620116 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e3c1e36-6ba2-4297-9488-8e0b59774844" (UID: "5e3c1e36-6ba2-4297-9488-8e0b59774844"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.636047 4852 generic.go:334] "Generic (PLEG): container finished" podID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerID="2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef" exitCode=0 Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.636113 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerDied","Data":"2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef"} Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.636148 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vltkv" event={"ID":"5e3c1e36-6ba2-4297-9488-8e0b59774844","Type":"ContainerDied","Data":"9ac399a8c284358732b6b9509f5dc562214272f010f56c57ddada26dfd49e9f5"} Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.636183 4852 scope.go:117] "RemoveContainer" containerID="2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.636520 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vltkv" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.674796 4852 scope.go:117] "RemoveContainer" containerID="5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.676096 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vltkv"] Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.683753 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vltkv"] Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.688483 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.688507 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvzw9\" (UniqueName: \"kubernetes.io/projected/5e3c1e36-6ba2-4297-9488-8e0b59774844-kube-api-access-gvzw9\") on node \"crc\" DevicePath \"\"" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.688519 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e3c1e36-6ba2-4297-9488-8e0b59774844-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.696214 4852 scope.go:117] "RemoveContainer" containerID="5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.722089 4852 scope.go:117] "RemoveContainer" containerID="2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef" Dec 10 12:48:52 crc kubenswrapper[4852]: E1210 12:48:52.723881 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef\": container with ID starting with 2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef not found: ID does not exist" containerID="2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.723948 4852 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef"} err="failed to get container status \"2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef\": rpc error: code = NotFound desc = could not find container \"2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef\": container with ID starting with 2d9436d86d35efa88acec13f8ddb2c175c6141943ba4a3306a6b7ce1176317ef not found: ID does not exist" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.723990 4852 scope.go:117] "RemoveContainer" containerID="5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8" Dec 10 12:48:52 crc kubenswrapper[4852]: E1210 12:48:52.724430 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8\": container with ID starting with 5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8 not found: ID does not exist" containerID="5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.724483 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8"} err="failed to get container status \"5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8\": rpc error: code = NotFound desc = could not find container \"5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8\": container with ID starting with 5185294065be1f55617d7d41e48457c3c74e2084f7dbaefa4df74eb3747638f8 not found: ID does not exist" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.724524 4852 scope.go:117] "RemoveContainer" containerID="5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8" Dec 10 12:48:52 crc kubenswrapper[4852]: E1210 12:48:52.724950 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8\": container with ID starting with 5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8 not found: ID does not exist" containerID="5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8" Dec 10 12:48:52 crc kubenswrapper[4852]: I1210 12:48:52.724997 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8"} err="failed to get container status \"5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8\": rpc error: code = NotFound desc = could not find container \"5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8\": container with ID starting with 5b47f34d21d9c7bc9ebeb90a4020f170ae4a38f58ba359c69068b4253eb655a8 not found: ID does not exist" Dec 10 12:48:53 crc kubenswrapper[4852]: I1210 12:48:53.649308 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hsj9" event={"ID":"84b72172-7c67-47a3-bba5-23f2f158a602","Type":"ContainerStarted","Data":"aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f"} Dec 10 12:48:53 crc kubenswrapper[4852]: I1210 12:48:53.668178 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hsj9" podStartSLOduration=3.8960916450000003 
podStartE2EDuration="6.668155208s" podCreationTimestamp="2025-12-10 12:48:47 +0000 UTC" firstStartedPulling="2025-12-10 12:48:49.554022665 +0000 UTC m=+3415.639547889" lastFinishedPulling="2025-12-10 12:48:52.326086208 +0000 UTC m=+3418.411611452" observedRunningTime="2025-12-10 12:48:53.665338797 +0000 UTC m=+3419.750864031" watchObservedRunningTime="2025-12-10 12:48:53.668155208 +0000 UTC m=+3419.753680432" Dec 10 12:48:54 crc kubenswrapper[4852]: I1210 12:48:54.188838 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" path="/var/lib/kubelet/pods/5e3c1e36-6ba2-4297-9488-8e0b59774844/volumes" Dec 10 12:48:55 crc kubenswrapper[4852]: I1210 12:48:55.170006 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:48:55 crc kubenswrapper[4852]: E1210 12:48:55.170424 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:48:57 crc kubenswrapper[4852]: I1210 12:48:57.877899 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:57 crc kubenswrapper[4852]: I1210 12:48:57.878210 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:57 crc kubenswrapper[4852]: I1210 12:48:57.926928 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:58 crc kubenswrapper[4852]: I1210 12:48:58.757941 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:48:58 crc kubenswrapper[4852]: I1210 12:48:58.811748 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hsj9"] Dec 10 12:49:00 crc kubenswrapper[4852]: I1210 12:49:00.716068 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hsj9" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="registry-server" containerID="cri-o://aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f" gracePeriod=2 Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.217373 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.351539 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-utilities\") pod \"84b72172-7c67-47a3-bba5-23f2f158a602\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.352196 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdt85\" (UniqueName: \"kubernetes.io/projected/84b72172-7c67-47a3-bba5-23f2f158a602-kube-api-access-mdt85\") pod \"84b72172-7c67-47a3-bba5-23f2f158a602\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.352307 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-catalog-content\") pod \"84b72172-7c67-47a3-bba5-23f2f158a602\" (UID: \"84b72172-7c67-47a3-bba5-23f2f158a602\") " Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.353245 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-utilities" (OuterVolumeSpecName: "utilities") pod "84b72172-7c67-47a3-bba5-23f2f158a602" (UID: "84b72172-7c67-47a3-bba5-23f2f158a602"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.359089 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b72172-7c67-47a3-bba5-23f2f158a602-kube-api-access-mdt85" (OuterVolumeSpecName: "kube-api-access-mdt85") pod "84b72172-7c67-47a3-bba5-23f2f158a602" (UID: "84b72172-7c67-47a3-bba5-23f2f158a602"). InnerVolumeSpecName "kube-api-access-mdt85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.404744 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84b72172-7c67-47a3-bba5-23f2f158a602" (UID: "84b72172-7c67-47a3-bba5-23f2f158a602"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.455577 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdt85\" (UniqueName: \"kubernetes.io/projected/84b72172-7c67-47a3-bba5-23f2f158a602-kube-api-access-mdt85\") on node \"crc\" DevicePath \"\"" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.455921 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.456049 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b72172-7c67-47a3-bba5-23f2f158a602-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.729141 4852 generic.go:334] "Generic (PLEG): container finished" podID="84b72172-7c67-47a3-bba5-23f2f158a602" containerID="aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f" exitCode=0 Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.731555 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hsj9" event={"ID":"84b72172-7c67-47a3-bba5-23f2f158a602","Type":"ContainerDied","Data":"aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f"} Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.731657 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hsj9" event={"ID":"84b72172-7c67-47a3-bba5-23f2f158a602","Type":"ContainerDied","Data":"203aaab02141512508265e91b9e9110946219aa5dcc9f970d833fdc8e29c70d7"} Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.731727 4852 scope.go:117] "RemoveContainer" containerID="aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.731998 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hsj9" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.769006 4852 scope.go:117] "RemoveContainer" containerID="f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.773702 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hsj9"] Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.782382 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hsj9"] Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.809410 4852 scope.go:117] "RemoveContainer" containerID="3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.848908 4852 scope.go:117] "RemoveContainer" containerID="aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f" Dec 10 12:49:01 crc kubenswrapper[4852]: E1210 12:49:01.851094 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f\": container with ID starting with aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f not found: ID does not exist" containerID="aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.851143 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f"} err="failed to get container status \"aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f\": rpc error: code = NotFound desc = could not find container \"aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f\": container with ID starting with aa350fc30a2c462b1e82ed1cc517442d5aa71dbc9b7766e8831d70785b9aec5f not found: ID does not exist" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.851173 4852 scope.go:117] "RemoveContainer" containerID="f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9" Dec 10 12:49:01 crc kubenswrapper[4852]: E1210 12:49:01.851519 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9\": container with ID starting with f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9 not found: ID does not exist" containerID="f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.851550 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9"} err="failed to get container status \"f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9\": rpc error: code = NotFound desc = could not find container \"f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9\": container with ID starting with f1762cf39a909b23abb5a79ffeaa5c258b3e7037ed3e2b200b942f4574351ab9 not found: ID does not exist" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.851568 4852 scope.go:117] "RemoveContainer" containerID="3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878" Dec 10 12:49:01 crc kubenswrapper[4852]: E1210 12:49:01.851886 4852 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878\": container with ID starting with 3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878 not found: ID does not exist" containerID="3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878" Dec 10 12:49:01 crc kubenswrapper[4852]: I1210 12:49:01.851914 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878"} err="failed to get container status \"3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878\": rpc error: code = NotFound desc = could not find container \"3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878\": container with ID starting with 3f2c5a3f5e43d1a70fc5c115a9d149d3834027bb9767f19a804937d2cae3f878 not found: ID does not exist" Dec 10 12:49:02 crc kubenswrapper[4852]: I1210 12:49:02.181076 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" path="/var/lib/kubelet/pods/84b72172-7c67-47a3-bba5-23f2f158a602/volumes" Dec 10 12:49:09 crc kubenswrapper[4852]: I1210 12:49:09.169394 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:49:09 crc kubenswrapper[4852]: E1210 12:49:09.170147 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:49:21 crc kubenswrapper[4852]: I1210 12:49:21.170920 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:49:21 crc kubenswrapper[4852]: I1210 12:49:21.921749 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"0044a3521602efa65af34e1df28854cb71feb8843ed5094eb0ccedc5ceaf9721"} Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.097856 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dfwrb"] Dec 10 12:49:54 crc kubenswrapper[4852]: E1210 12:49:54.099195 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="extract-utilities" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099217 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="extract-utilities" Dec 10 12:49:54 crc kubenswrapper[4852]: E1210 12:49:54.099250 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="registry-server" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099259 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="registry-server" Dec 10 12:49:54 crc kubenswrapper[4852]: E1210 12:49:54.099288 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="extract-utilities" 
Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099296 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="extract-utilities" Dec 10 12:49:54 crc kubenswrapper[4852]: E1210 12:49:54.099312 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="extract-content" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099319 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="extract-content" Dec 10 12:49:54 crc kubenswrapper[4852]: E1210 12:49:54.099332 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="registry-server" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099340 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="registry-server" Dec 10 12:49:54 crc kubenswrapper[4852]: E1210 12:49:54.099350 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="extract-content" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099358 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="extract-content" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099632 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b72172-7c67-47a3-bba5-23f2f158a602" containerName="registry-server" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.099648 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3c1e36-6ba2-4297-9488-8e0b59774844" containerName="registry-server" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.101264 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.110839 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfwrb"] Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.190669 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-catalog-content\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.190889 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-utilities\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.191311 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfwg8\" (UniqueName: \"kubernetes.io/projected/496f60a3-a543-40d2-95bc-0ff95856e366-kube-api-access-mfwg8\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.300418 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-utilities\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.300677 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfwg8\" (UniqueName: \"kubernetes.io/projected/496f60a3-a543-40d2-95bc-0ff95856e366-kube-api-access-mfwg8\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.300870 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-catalog-content\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.301109 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-utilities\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.301613 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-catalog-content\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.331404 4852 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mfwg8\" (UniqueName: \"kubernetes.io/projected/496f60a3-a543-40d2-95bc-0ff95856e366-kube-api-access-mfwg8\") pod \"community-operators-dfwrb\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.439270 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:49:54 crc kubenswrapper[4852]: I1210 12:49:54.966983 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfwrb"] Dec 10 12:49:55 crc kubenswrapper[4852]: I1210 12:49:55.340726 4852 generic.go:334] "Generic (PLEG): container finished" podID="496f60a3-a543-40d2-95bc-0ff95856e366" containerID="08bf3903027aec46c26357813b9a26d14f9a3973c7fc4ea7ad0405941cb9cad5" exitCode=0 Dec 10 12:49:55 crc kubenswrapper[4852]: I1210 12:49:55.340784 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerDied","Data":"08bf3903027aec46c26357813b9a26d14f9a3973c7fc4ea7ad0405941cb9cad5"} Dec 10 12:49:55 crc kubenswrapper[4852]: I1210 12:49:55.340816 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerStarted","Data":"0ff35338c64920bba19fae620d8427d024f0edffbb59ec8ea5499079ee8ebb88"} Dec 10 12:49:56 crc kubenswrapper[4852]: I1210 12:49:56.356611 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerStarted","Data":"d9f6bb6ec738eb2a8cfbb48d2ee36830a171052d50d389b2cfd716704b00f436"} Dec 10 12:49:57 crc kubenswrapper[4852]: I1210 12:49:57.366119 4852 generic.go:334] "Generic (PLEG): container finished" podID="496f60a3-a543-40d2-95bc-0ff95856e366" containerID="d9f6bb6ec738eb2a8cfbb48d2ee36830a171052d50d389b2cfd716704b00f436" exitCode=0 Dec 10 12:49:57 crc kubenswrapper[4852]: I1210 12:49:57.366183 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerDied","Data":"d9f6bb6ec738eb2a8cfbb48d2ee36830a171052d50d389b2cfd716704b00f436"} Dec 10 12:49:58 crc kubenswrapper[4852]: I1210 12:49:58.390906 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerStarted","Data":"b15546fbda456e2a1d8d9d69c22e8f99272523c34e25aa3e7675e4c94469370d"} Dec 10 12:49:58 crc kubenswrapper[4852]: I1210 12:49:58.420098 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dfwrb" podStartSLOduration=1.845133273 podStartE2EDuration="4.420080678s" podCreationTimestamp="2025-12-10 12:49:54 +0000 UTC" firstStartedPulling="2025-12-10 12:49:55.344870468 +0000 UTC m=+3481.430395692" lastFinishedPulling="2025-12-10 12:49:57.919817873 +0000 UTC m=+3484.005343097" observedRunningTime="2025-12-10 12:49:58.417144174 +0000 UTC m=+3484.502669398" watchObservedRunningTime="2025-12-10 12:49:58.420080678 +0000 UTC m=+3484.505605902" Dec 10 12:50:04 crc kubenswrapper[4852]: I1210 12:50:04.440449 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:50:04 crc kubenswrapper[4852]: I1210 12:50:04.441144 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:50:04 crc kubenswrapper[4852]: I1210 12:50:04.493844 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:50:04 crc kubenswrapper[4852]: I1210 12:50:04.543821 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:50:04 crc kubenswrapper[4852]: I1210 12:50:04.730464 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfwrb"] Dec 10 12:50:06 crc kubenswrapper[4852]: I1210 12:50:06.487412 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dfwrb" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="registry-server" containerID="cri-o://b15546fbda456e2a1d8d9d69c22e8f99272523c34e25aa3e7675e4c94469370d" gracePeriod=2 Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.498465 4852 generic.go:334] "Generic (PLEG): container finished" podID="496f60a3-a543-40d2-95bc-0ff95856e366" containerID="b15546fbda456e2a1d8d9d69c22e8f99272523c34e25aa3e7675e4c94469370d" exitCode=0 Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.498569 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerDied","Data":"b15546fbda456e2a1d8d9d69c22e8f99272523c34e25aa3e7675e4c94469370d"} Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.702131 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.873527 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-utilities\") pod \"496f60a3-a543-40d2-95bc-0ff95856e366\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.873702 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfwg8\" (UniqueName: \"kubernetes.io/projected/496f60a3-a543-40d2-95bc-0ff95856e366-kube-api-access-mfwg8\") pod \"496f60a3-a543-40d2-95bc-0ff95856e366\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.873773 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-catalog-content\") pod \"496f60a3-a543-40d2-95bc-0ff95856e366\" (UID: \"496f60a3-a543-40d2-95bc-0ff95856e366\") " Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.874825 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-utilities" (OuterVolumeSpecName: "utilities") pod "496f60a3-a543-40d2-95bc-0ff95856e366" (UID: "496f60a3-a543-40d2-95bc-0ff95856e366"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.879756 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496f60a3-a543-40d2-95bc-0ff95856e366-kube-api-access-mfwg8" (OuterVolumeSpecName: "kube-api-access-mfwg8") pod "496f60a3-a543-40d2-95bc-0ff95856e366" (UID: "496f60a3-a543-40d2-95bc-0ff95856e366"). InnerVolumeSpecName "kube-api-access-mfwg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.919799 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "496f60a3-a543-40d2-95bc-0ff95856e366" (UID: "496f60a3-a543-40d2-95bc-0ff95856e366"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.976061 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.976104 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfwg8\" (UniqueName: \"kubernetes.io/projected/496f60a3-a543-40d2-95bc-0ff95856e366-kube-api-access-mfwg8\") on node \"crc\" DevicePath \"\"" Dec 10 12:50:07 crc kubenswrapper[4852]: I1210 12:50:07.976118 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/496f60a3-a543-40d2-95bc-0ff95856e366-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.509853 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfwrb" event={"ID":"496f60a3-a543-40d2-95bc-0ff95856e366","Type":"ContainerDied","Data":"0ff35338c64920bba19fae620d8427d024f0edffbb59ec8ea5499079ee8ebb88"} Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.509907 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfwrb" Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.511023 4852 scope.go:117] "RemoveContainer" containerID="b15546fbda456e2a1d8d9d69c22e8f99272523c34e25aa3e7675e4c94469370d" Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.534387 4852 scope.go:117] "RemoveContainer" containerID="d9f6bb6ec738eb2a8cfbb48d2ee36830a171052d50d389b2cfd716704b00f436" Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.537929 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfwrb"] Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.547274 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dfwrb"] Dec 10 12:50:08 crc kubenswrapper[4852]: I1210 12:50:08.564561 4852 scope.go:117] "RemoveContainer" containerID="08bf3903027aec46c26357813b9a26d14f9a3973c7fc4ea7ad0405941cb9cad5" Dec 10 12:50:10 crc kubenswrapper[4852]: I1210 12:50:10.180732 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" path="/var/lib/kubelet/pods/496f60a3-a543-40d2-95bc-0ff95856e366/volumes" Dec 10 12:51:45 crc kubenswrapper[4852]: I1210 12:51:45.790384 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:51:45 crc kubenswrapper[4852]: I1210 12:51:45.790912 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:52:15 crc kubenswrapper[4852]: I1210 12:52:15.789973 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:52:15 crc kubenswrapper[4852]: I1210 12:52:15.790636 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.790406 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.791049 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.791110 4852 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.792116 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0044a3521602efa65af34e1df28854cb71feb8843ed5094eb0ccedc5ceaf9721"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.792400 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://0044a3521602efa65af34e1df28854cb71feb8843ed5094eb0ccedc5ceaf9721" gracePeriod=600 Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.996581 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="0044a3521602efa65af34e1df28854cb71feb8843ed5094eb0ccedc5ceaf9721" exitCode=0 Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.996625 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"0044a3521602efa65af34e1df28854cb71feb8843ed5094eb0ccedc5ceaf9721"} Dec 10 12:52:45 crc kubenswrapper[4852]: I1210 12:52:45.996656 4852 scope.go:117] "RemoveContainer" containerID="86d67a174005e4200997724b8a83ed2ab8f3f3cc46e3f0bd90c9e9ef7917509f" Dec 10 12:52:47 crc kubenswrapper[4852]: I1210 12:52:47.010708 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142"} Dec 10 12:54:11 crc kubenswrapper[4852]: I1210 12:54:11.047103 4852 generic.go:334] "Generic (PLEG): container finished" podID="9238ddbd-fcdf-4612-974f-114508e02356" containerID="977fa0b6f0594996e73a8e8838e11a27c80263f27209fde0a338000fef367fd9" exitCode=0 Dec 10 12:54:11 crc kubenswrapper[4852]: I1210 12:54:11.047625 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9238ddbd-fcdf-4612-974f-114508e02356","Type":"ContainerDied","Data":"977fa0b6f0594996e73a8e8838e11a27c80263f27209fde0a338000fef367fd9"} Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.453631 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520336 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520402 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-config-data\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520437 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config-secret\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520489 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ssh-key\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520591 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-temporary\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520638 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-workdir\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520678 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ca-certs\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520732 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.520772 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptrn5\" (UniqueName: \"kubernetes.io/projected/9238ddbd-fcdf-4612-974f-114508e02356-kube-api-access-ptrn5\") pod \"9238ddbd-fcdf-4612-974f-114508e02356\" (UID: \"9238ddbd-fcdf-4612-974f-114508e02356\") " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.521345 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-config-data" (OuterVolumeSpecName: "config-data") pod 
"9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.521712 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.526769 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9238ddbd-fcdf-4612-974f-114508e02356-kube-api-access-ptrn5" (OuterVolumeSpecName: "kube-api-access-ptrn5") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "kube-api-access-ptrn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.526766 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.527754 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.553712 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.554853 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.557192 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.571427 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9238ddbd-fcdf-4612-974f-114508e02356" (UID: "9238ddbd-fcdf-4612-974f-114508e02356"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623085 4852 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623152 4852 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623164 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptrn5\" (UniqueName: \"kubernetes.io/projected/9238ddbd-fcdf-4612-974f-114508e02356-kube-api-access-ptrn5\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623175 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623184 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9238ddbd-fcdf-4612-974f-114508e02356-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623192 4852 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623202 4852 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9238ddbd-fcdf-4612-974f-114508e02356-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623264 4852 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.623275 4852 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9238ddbd-fcdf-4612-974f-114508e02356-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.648386 4852 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 10 12:54:12 crc kubenswrapper[4852]: I1210 12:54:12.725194 4852 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 10 12:54:13 crc kubenswrapper[4852]: I1210 12:54:13.071919 4852 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"9238ddbd-fcdf-4612-974f-114508e02356","Type":"ContainerDied","Data":"0b3e7971d7c33f7a34aef85ddb1dc8dc5a5ffd15711c7bd4caae8e47b9522c69"} Dec 10 12:54:13 crc kubenswrapper[4852]: I1210 12:54:13.072512 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3e7971d7c33f7a34aef85ddb1dc8dc5a5ffd15711c7bd4caae8e47b9522c69" Dec 10 12:54:13 crc kubenswrapper[4852]: I1210 12:54:13.072072 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.516441 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 10 12:54:19 crc kubenswrapper[4852]: E1210 12:54:19.518000 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="extract-content" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.518017 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="extract-content" Dec 10 12:54:19 crc kubenswrapper[4852]: E1210 12:54:19.518034 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="registry-server" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.518043 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="registry-server" Dec 10 12:54:19 crc kubenswrapper[4852]: E1210 12:54:19.518061 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9238ddbd-fcdf-4612-974f-114508e02356" containerName="tempest-tests-tempest-tests-runner" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.518069 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="9238ddbd-fcdf-4612-974f-114508e02356" containerName="tempest-tests-tempest-tests-runner" Dec 10 12:54:19 crc kubenswrapper[4852]: E1210 12:54:19.518098 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="extract-utilities" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.518106 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="extract-utilities" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.518392 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="9238ddbd-fcdf-4612-974f-114508e02356" containerName="tempest-tests-tempest-tests-runner" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.518446 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="496f60a3-a543-40d2-95bc-0ff95856e366" containerName="registry-server" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.519415 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.522676 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-t56dj" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.524673 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.666456 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br6tw\" (UniqueName: \"kubernetes.io/projected/ce2aca81-09ab-4dd7-b9b7-d35cef864a73-kube-api-access-br6tw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.666889 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.768706 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br6tw\" (UniqueName: \"kubernetes.io/projected/ce2aca81-09ab-4dd7-b9b7-d35cef864a73-kube-api-access-br6tw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.769128 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.769654 4852 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.813747 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br6tw\" (UniqueName: \"kubernetes.io/projected/ce2aca81-09ab-4dd7-b9b7-d35cef864a73-kube-api-access-br6tw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc kubenswrapper[4852]: I1210 12:54:19.827503 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ce2aca81-09ab-4dd7-b9b7-d35cef864a73\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:19 crc 
kubenswrapper[4852]: I1210 12:54:19.881204 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 10 12:54:20 crc kubenswrapper[4852]: I1210 12:54:20.314852 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 10 12:54:20 crc kubenswrapper[4852]: I1210 12:54:20.319990 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:54:21 crc kubenswrapper[4852]: I1210 12:54:21.171629 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ce2aca81-09ab-4dd7-b9b7-d35cef864a73","Type":"ContainerStarted","Data":"9bf9ce9ab355ae6dd3c9752260b28793fd40fd3f3ac8fbabd3b678cdbdd8defc"} Dec 10 12:54:22 crc kubenswrapper[4852]: I1210 12:54:22.183207 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ce2aca81-09ab-4dd7-b9b7-d35cef864a73","Type":"ContainerStarted","Data":"994de54afdaf0bbffb8de1feeba4de8ecfb1958371df9c23294b12d2f92d04f3"} Dec 10 12:54:22 crc kubenswrapper[4852]: I1210 12:54:22.204829 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.071226705 podStartE2EDuration="3.204809196s" podCreationTimestamp="2025-12-10 12:54:19 +0000 UTC" firstStartedPulling="2025-12-10 12:54:20.319734273 +0000 UTC m=+3746.405259507" lastFinishedPulling="2025-12-10 12:54:21.453316774 +0000 UTC m=+3747.538841998" observedRunningTime="2025-12-10 12:54:22.202102067 +0000 UTC m=+3748.287627301" watchObservedRunningTime="2025-12-10 12:54:22.204809196 +0000 UTC m=+3748.290334420" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.428195 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24l7j/must-gather-h899f"] Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.430125 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.434683 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-24l7j"/"kube-root-ca.crt" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.434957 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-24l7j"/"openshift-service-ca.crt" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.436183 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-24l7j"/"default-dockercfg-xhb52" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.445591 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-24l7j/must-gather-h899f"] Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.516810 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68j8h\" (UniqueName: \"kubernetes.io/projected/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-kube-api-access-68j8h\") pod \"must-gather-h899f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.516863 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-must-gather-output\") pod \"must-gather-h899f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.618985 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68j8h\" (UniqueName: \"kubernetes.io/projected/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-kube-api-access-68j8h\") pod \"must-gather-h899f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.619031 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-must-gather-output\") pod \"must-gather-h899f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.619468 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-must-gather-output\") pod \"must-gather-h899f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.643174 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68j8h\" (UniqueName: \"kubernetes.io/projected/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-kube-api-access-68j8h\") pod \"must-gather-h899f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:44 crc kubenswrapper[4852]: I1210 12:54:44.752448 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 12:54:45 crc kubenswrapper[4852]: I1210 12:54:45.320758 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-24l7j/must-gather-h899f"] Dec 10 12:54:45 crc kubenswrapper[4852]: W1210 12:54:45.325706 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86f75cc3_6c1f_4a3a_8c8e_5a3ede0d5d6f.slice/crio-856aec160fd8bc2d198357ca2e63b2c94c008f4a4af5fc003713e8f2b7a77c2e WatchSource:0}: Error finding container 856aec160fd8bc2d198357ca2e63b2c94c008f4a4af5fc003713e8f2b7a77c2e: Status 404 returned error can't find the container with id 856aec160fd8bc2d198357ca2e63b2c94c008f4a4af5fc003713e8f2b7a77c2e Dec 10 12:54:45 crc kubenswrapper[4852]: I1210 12:54:45.392323 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/must-gather-h899f" event={"ID":"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f","Type":"ContainerStarted","Data":"856aec160fd8bc2d198357ca2e63b2c94c008f4a4af5fc003713e8f2b7a77c2e"} Dec 10 12:54:52 crc kubenswrapper[4852]: I1210 12:54:52.466315 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/must-gather-h899f" event={"ID":"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f","Type":"ContainerStarted","Data":"d8d3fc60cfaabdc7b1ccf9642e47aa5c5ab68e2f66fa10ecaeb1859856f51960"} Dec 10 12:54:53 crc kubenswrapper[4852]: I1210 12:54:53.478724 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/must-gather-h899f" event={"ID":"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f","Type":"ContainerStarted","Data":"f1350aad3f798dd717c06816140a8c8d3a9c7df8a61aa6e74a8084c2714e0662"} Dec 10 12:54:53 crc kubenswrapper[4852]: I1210 12:54:53.503644 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-24l7j/must-gather-h899f" podStartSLOduration=2.78858464 podStartE2EDuration="9.503621287s" podCreationTimestamp="2025-12-10 12:54:44 +0000 UTC" firstStartedPulling="2025-12-10 12:54:45.328126764 +0000 UTC m=+3771.413651988" lastFinishedPulling="2025-12-10 12:54:52.043163401 +0000 UTC m=+3778.128688635" observedRunningTime="2025-12-10 12:54:53.501875192 +0000 UTC m=+3779.587400426" watchObservedRunningTime="2025-12-10 12:54:53.503621287 +0000 UTC m=+3779.589146521" Dec 10 12:54:55 crc kubenswrapper[4852]: I1210 12:54:55.896998 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24l7j/crc-debug-njd8j"] Dec 10 12:54:55 crc kubenswrapper[4852]: I1210 12:54:55.899355 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.061788 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-host\") pod \"crc-debug-njd8j\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.061871 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clngm\" (UniqueName: \"kubernetes.io/projected/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-kube-api-access-clngm\") pod \"crc-debug-njd8j\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.164185 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-host\") pod \"crc-debug-njd8j\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.164645 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clngm\" (UniqueName: \"kubernetes.io/projected/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-kube-api-access-clngm\") pod \"crc-debug-njd8j\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.164566 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-host\") pod \"crc-debug-njd8j\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.202178 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clngm\" (UniqueName: \"kubernetes.io/projected/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-kube-api-access-clngm\") pod \"crc-debug-njd8j\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.225994 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:54:56 crc kubenswrapper[4852]: I1210 12:54:56.508289 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-njd8j" event={"ID":"5084ad3a-beee-4d5c-b13c-f9a5e17cd791","Type":"ContainerStarted","Data":"b208ecfc22911c1cc33b2981310e96d0d1031ebda7eba04ea22dc334d98eacaa"} Dec 10 12:55:10 crc kubenswrapper[4852]: I1210 12:55:10.645034 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-njd8j" event={"ID":"5084ad3a-beee-4d5c-b13c-f9a5e17cd791","Type":"ContainerStarted","Data":"7d40ac8b5053c8de90405d013339d285b278a779c6c5a451044bfce4d336c4b2"} Dec 10 12:55:10 crc kubenswrapper[4852]: I1210 12:55:10.674789 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-24l7j/crc-debug-njd8j" podStartSLOduration=2.10440161 podStartE2EDuration="15.674753158s" podCreationTimestamp="2025-12-10 12:54:55 +0000 UTC" firstStartedPulling="2025-12-10 12:54:56.272771521 +0000 UTC m=+3782.358296745" lastFinishedPulling="2025-12-10 12:55:09.843123069 +0000 UTC m=+3795.928648293" observedRunningTime="2025-12-10 12:55:10.657858509 +0000 UTC m=+3796.743383733" watchObservedRunningTime="2025-12-10 12:55:10.674753158 +0000 UTC m=+3796.760278382" Dec 10 12:55:15 crc kubenswrapper[4852]: I1210 12:55:15.790633 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:55:15 crc kubenswrapper[4852]: I1210 12:55:15.791063 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:55:45 crc kubenswrapper[4852]: I1210 12:55:45.789964 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:55:45 crc kubenswrapper[4852]: I1210 12:55:45.791831 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:56:02 crc kubenswrapper[4852]: I1210 12:56:02.149360 4852 generic.go:334] "Generic (PLEG): container finished" podID="5084ad3a-beee-4d5c-b13c-f9a5e17cd791" containerID="7d40ac8b5053c8de90405d013339d285b278a779c6c5a451044bfce4d336c4b2" exitCode=0 Dec 10 12:56:02 crc kubenswrapper[4852]: I1210 12:56:02.149414 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-njd8j" event={"ID":"5084ad3a-beee-4d5c-b13c-f9a5e17cd791","Type":"ContainerDied","Data":"7d40ac8b5053c8de90405d013339d285b278a779c6c5a451044bfce4d336c4b2"} Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.294757 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.335372 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24l7j/crc-debug-njd8j"] Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.343627 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24l7j/crc-debug-njd8j"] Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.485293 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clngm\" (UniqueName: \"kubernetes.io/projected/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-kube-api-access-clngm\") pod \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.485452 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-host\") pod \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\" (UID: \"5084ad3a-beee-4d5c-b13c-f9a5e17cd791\") " Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.485568 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-host" (OuterVolumeSpecName: "host") pod "5084ad3a-beee-4d5c-b13c-f9a5e17cd791" (UID: "5084ad3a-beee-4d5c-b13c-f9a5e17cd791"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.485989 4852 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-host\") on node \"crc\" DevicePath \"\"" Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.492264 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-kube-api-access-clngm" (OuterVolumeSpecName: "kube-api-access-clngm") pod "5084ad3a-beee-4d5c-b13c-f9a5e17cd791" (UID: "5084ad3a-beee-4d5c-b13c-f9a5e17cd791"). InnerVolumeSpecName "kube-api-access-clngm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:56:03 crc kubenswrapper[4852]: I1210 12:56:03.588407 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clngm\" (UniqueName: \"kubernetes.io/projected/5084ad3a-beee-4d5c-b13c-f9a5e17cd791-kube-api-access-clngm\") on node \"crc\" DevicePath \"\"" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.169096 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.180473 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5084ad3a-beee-4d5c-b13c-f9a5e17cd791" path="/var/lib/kubelet/pods/5084ad3a-beee-4d5c-b13c-f9a5e17cd791/volumes" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.181150 4852 scope.go:117] "RemoveContainer" containerID="7d40ac8b5053c8de90405d013339d285b278a779c6c5a451044bfce4d336c4b2" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.538830 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24l7j/crc-debug-d7wzh"] Dec 10 12:56:04 crc kubenswrapper[4852]: E1210 12:56:04.539317 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5084ad3a-beee-4d5c-b13c-f9a5e17cd791" containerName="container-00" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.539333 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="5084ad3a-beee-4d5c-b13c-f9a5e17cd791" containerName="container-00" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.539545 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="5084ad3a-beee-4d5c-b13c-f9a5e17cd791" containerName="container-00" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.540335 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.607044 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-host\") pod \"crc-debug-d7wzh\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.607292 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbj2m\" (UniqueName: \"kubernetes.io/projected/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-kube-api-access-xbj2m\") pod \"crc-debug-d7wzh\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.709038 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-host\") pod \"crc-debug-d7wzh\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.709197 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbj2m\" (UniqueName: \"kubernetes.io/projected/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-kube-api-access-xbj2m\") pod \"crc-debug-d7wzh\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.709194 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-host\") pod \"crc-debug-d7wzh\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.737626 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbj2m\" (UniqueName: 
\"kubernetes.io/projected/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-kube-api-access-xbj2m\") pod \"crc-debug-d7wzh\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: I1210 12:56:04.857076 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:04 crc kubenswrapper[4852]: W1210 12:56:04.887507 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52ba7bab_f3d8_4cfd_a022_7dd11aa1cd2d.slice/crio-76c5798e976b01224937a2a0c925c3b6a63b8080ee43ff305d01ebbe50a13c42 WatchSource:0}: Error finding container 76c5798e976b01224937a2a0c925c3b6a63b8080ee43ff305d01ebbe50a13c42: Status 404 returned error can't find the container with id 76c5798e976b01224937a2a0c925c3b6a63b8080ee43ff305d01ebbe50a13c42 Dec 10 12:56:05 crc kubenswrapper[4852]: I1210 12:56:05.179151 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" event={"ID":"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d","Type":"ContainerStarted","Data":"fb997d16c7439940734eeba78fd13599b0477787bc191432e8a817c2bf520ed1"} Dec 10 12:56:05 crc kubenswrapper[4852]: I1210 12:56:05.179190 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" event={"ID":"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d","Type":"ContainerStarted","Data":"76c5798e976b01224937a2a0c925c3b6a63b8080ee43ff305d01ebbe50a13c42"} Dec 10 12:56:05 crc kubenswrapper[4852]: I1210 12:56:05.196632 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" podStartSLOduration=1.196606231 podStartE2EDuration="1.196606231s" podCreationTimestamp="2025-12-10 12:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 12:56:05.191647895 +0000 UTC m=+3851.277173119" watchObservedRunningTime="2025-12-10 12:56:05.196606231 +0000 UTC m=+3851.282131465" Dec 10 12:56:06 crc kubenswrapper[4852]: I1210 12:56:06.189972 4852 generic.go:334] "Generic (PLEG): container finished" podID="52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" containerID="fb997d16c7439940734eeba78fd13599b0477787bc191432e8a817c2bf520ed1" exitCode=0 Dec 10 12:56:06 crc kubenswrapper[4852]: I1210 12:56:06.190034 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" event={"ID":"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d","Type":"ContainerDied","Data":"fb997d16c7439940734eeba78fd13599b0477787bc191432e8a817c2bf520ed1"} Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.313508 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.356101 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24l7j/crc-debug-d7wzh"] Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.362214 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-host\") pod \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.362320 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-host" (OuterVolumeSpecName: "host") pod "52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" (UID: "52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.362575 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbj2m\" (UniqueName: \"kubernetes.io/projected/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-kube-api-access-xbj2m\") pod \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\" (UID: \"52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d\") " Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.363047 4852 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-host\") on node \"crc\" DevicePath \"\"" Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.365090 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24l7j/crc-debug-d7wzh"] Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.369156 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-kube-api-access-xbj2m" (OuterVolumeSpecName: "kube-api-access-xbj2m") pod "52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" (UID: "52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d"). InnerVolumeSpecName "kube-api-access-xbj2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:56:07 crc kubenswrapper[4852]: I1210 12:56:07.465143 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbj2m\" (UniqueName: \"kubernetes.io/projected/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d-kube-api-access-xbj2m\") on node \"crc\" DevicePath \"\"" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.185453 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" path="/var/lib/kubelet/pods/52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d/volumes" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.211587 4852 scope.go:117] "RemoveContainer" containerID="fb997d16c7439940734eeba78fd13599b0477787bc191432e8a817c2bf520ed1" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.211682 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-d7wzh" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.520977 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24l7j/crc-debug-pqpcp"] Dec 10 12:56:08 crc kubenswrapper[4852]: E1210 12:56:08.521850 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" containerName="container-00" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.521870 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" containerName="container-00" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.522130 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ba7bab-f3d8-4cfd-a022-7dd11aa1cd2d" containerName="container-00" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.522953 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.584751 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcqh\" (UniqueName: \"kubernetes.io/projected/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-kube-api-access-ptcqh\") pod \"crc-debug-pqpcp\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.584836 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-host\") pod \"crc-debug-pqpcp\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.686027 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-host\") pod \"crc-debug-pqpcp\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.686150 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-host\") pod \"crc-debug-pqpcp\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.686214 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcqh\" (UniqueName: \"kubernetes.io/projected/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-kube-api-access-ptcqh\") pod \"crc-debug-pqpcp\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.707463 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcqh\" (UniqueName: \"kubernetes.io/projected/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-kube-api-access-ptcqh\") pod \"crc-debug-pqpcp\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: I1210 12:56:08.847458 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:08 crc kubenswrapper[4852]: W1210 12:56:08.880112 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9bff4cf_aac4_4ddd_8028_e49dda6a2b31.slice/crio-d0e24cce2b558f6baaafe02f77508ca4b60d26d26c406ba52b7ffa6875e54b90 WatchSource:0}: Error finding container d0e24cce2b558f6baaafe02f77508ca4b60d26d26c406ba52b7ffa6875e54b90: Status 404 returned error can't find the container with id d0e24cce2b558f6baaafe02f77508ca4b60d26d26c406ba52b7ffa6875e54b90 Dec 10 12:56:09 crc kubenswrapper[4852]: I1210 12:56:09.223493 4852 generic.go:334] "Generic (PLEG): container finished" podID="b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" containerID="17fd364148bb2ffb7a2b0ce74a88c18d6fade7dc5b09f1d996ed8fc49d7d2c0d" exitCode=0 Dec 10 12:56:09 crc kubenswrapper[4852]: I1210 12:56:09.223597 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-pqpcp" event={"ID":"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31","Type":"ContainerDied","Data":"17fd364148bb2ffb7a2b0ce74a88c18d6fade7dc5b09f1d996ed8fc49d7d2c0d"} Dec 10 12:56:09 crc kubenswrapper[4852]: I1210 12:56:09.223837 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/crc-debug-pqpcp" event={"ID":"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31","Type":"ContainerStarted","Data":"d0e24cce2b558f6baaafe02f77508ca4b60d26d26c406ba52b7ffa6875e54b90"} Dec 10 12:56:09 crc kubenswrapper[4852]: I1210 12:56:09.267036 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24l7j/crc-debug-pqpcp"] Dec 10 12:56:09 crc kubenswrapper[4852]: I1210 12:56:09.277379 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24l7j/crc-debug-pqpcp"] Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.342710 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.420020 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-host\") pod \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.420080 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptcqh\" (UniqueName: \"kubernetes.io/projected/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-kube-api-access-ptcqh\") pod \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\" (UID: \"b9bff4cf-aac4-4ddd-8028-e49dda6a2b31\") " Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.420173 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-host" (OuterVolumeSpecName: "host") pod "b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" (UID: "b9bff4cf-aac4-4ddd-8028-e49dda6a2b31"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.420415 4852 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-host\") on node \"crc\" DevicePath \"\"" Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.427767 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-kube-api-access-ptcqh" (OuterVolumeSpecName: "kube-api-access-ptcqh") pod "b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" (UID: "b9bff4cf-aac4-4ddd-8028-e49dda6a2b31"). InnerVolumeSpecName "kube-api-access-ptcqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:56:10 crc kubenswrapper[4852]: I1210 12:56:10.521977 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptcqh\" (UniqueName: \"kubernetes.io/projected/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31-kube-api-access-ptcqh\") on node \"crc\" DevicePath \"\"" Dec 10 12:56:11 crc kubenswrapper[4852]: I1210 12:56:11.245099 4852 scope.go:117] "RemoveContainer" containerID="17fd364148bb2ffb7a2b0ce74a88c18d6fade7dc5b09f1d996ed8fc49d7d2c0d" Dec 10 12:56:11 crc kubenswrapper[4852]: I1210 12:56:11.245137 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-pqpcp" Dec 10 12:56:12 crc kubenswrapper[4852]: I1210 12:56:12.181591 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" path="/var/lib/kubelet/pods/b9bff4cf-aac4-4ddd-8028-e49dda6a2b31/volumes" Dec 10 12:56:15 crc kubenswrapper[4852]: I1210 12:56:15.790194 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 12:56:15 crc kubenswrapper[4852]: I1210 12:56:15.790566 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 12:56:15 crc kubenswrapper[4852]: I1210 12:56:15.790619 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 12:56:15 crc kubenswrapper[4852]: I1210 12:56:15.791546 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 12:56:15 crc kubenswrapper[4852]: I1210 12:56:15.791608 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" gracePeriod=600 Dec 10 12:56:15 crc kubenswrapper[4852]: E1210 12:56:15.933506 4852 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:56:16 crc kubenswrapper[4852]: I1210 12:56:16.293205 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" exitCode=0 Dec 10 12:56:16 crc kubenswrapper[4852]: I1210 12:56:16.293271 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142"} Dec 10 12:56:16 crc kubenswrapper[4852]: I1210 12:56:16.293378 4852 scope.go:117] "RemoveContainer" containerID="0044a3521602efa65af34e1df28854cb71feb8843ed5094eb0ccedc5ceaf9721" Dec 10 12:56:16 crc kubenswrapper[4852]: I1210 12:56:16.294199 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:56:16 crc kubenswrapper[4852]: E1210 12:56:16.294536 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:56:26 crc kubenswrapper[4852]: I1210 12:56:26.220418 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b6dbfcbc8-pkb6j_8c77ea6d-6206-4361-8b0f-e8f273666084/barbican-api/0.log" Dec 10 12:56:26 crc kubenswrapper[4852]: I1210 12:56:26.417616 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b6dbfcbc8-pkb6j_8c77ea6d-6206-4361-8b0f-e8f273666084/barbican-api-log/0.log" Dec 10 12:56:26 crc kubenswrapper[4852]: I1210 12:56:26.462191 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-ffd755b9d-ffwqf_67165a10-114d-48b8-9c9b-ce7525e7d98d/barbican-keystone-listener/0.log" Dec 10 12:56:26 crc kubenswrapper[4852]: I1210 12:56:26.611344 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-ffd755b9d-ffwqf_67165a10-114d-48b8-9c9b-ce7525e7d98d/barbican-keystone-listener-log/0.log" Dec 10 12:56:26 crc kubenswrapper[4852]: I1210 12:56:26.735977 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db94ccfb7-vvhtv_cc13355d-4438-440e-bfdf-debe0d6dae5b/barbican-worker/0.log" Dec 10 12:56:26 crc kubenswrapper[4852]: I1210 12:56:26.772169 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db94ccfb7-vvhtv_cc13355d-4438-440e-bfdf-debe0d6dae5b/barbican-worker-log/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.020450 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/ceilometer-central-agent/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.026479 4852 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp_3ed1622b-fe84-4402-b15c-6971dde2a93f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.161066 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/ceilometer-notification-agent/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.169547 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:56:27 crc kubenswrapper[4852]: E1210 12:56:27.169763 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.209441 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/proxy-httpd/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.247302 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/sg-core/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.453092 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3ee70b55-95d5-4ea5-9626-a6482097668c/cinder-api-log/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.462278 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3ee70b55-95d5-4ea5-9626-a6482097668c/cinder-api/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.594042 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ff8bb370-489b-402e-a532-8dc299fa3aee/cinder-scheduler/0.log" Dec 10 12:56:27 crc kubenswrapper[4852]: I1210 12:56:27.691379 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ff8bb370-489b-402e-a532-8dc299fa3aee/probe/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.061335 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2_5d8bf94c-e162-497e-8f35-6171e96384a3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.154985 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf_95d0bf0c-a43a-47e9-bf7e-5bdad23e513e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.272654 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-rl9rt_4e3cbf64-e31a-4f5b-a045-8a3de2cba72b/init/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.506136 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-rl9rt_4e3cbf64-e31a-4f5b-a045-8a3de2cba72b/init/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.526105 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-rl9rt_4e3cbf64-e31a-4f5b-a045-8a3de2cba72b/dnsmasq-dns/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.583882 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wrflm_abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.816014 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0edbc55b-f57a-46c0-9991-33d794c74319/glance-log/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.831522 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0edbc55b-f57a-46c0-9991-33d794c74319/glance-httpd/0.log" Dec 10 12:56:28 crc kubenswrapper[4852]: I1210 12:56:28.953075 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f7719c76-46f2-456f-8e69-8becce7f3b9c/glance-httpd/0.log" Dec 10 12:56:29 crc kubenswrapper[4852]: I1210 12:56:29.058605 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f7719c76-46f2-456f-8e69-8becce7f3b9c/glance-log/0.log" Dec 10 12:56:29 crc kubenswrapper[4852]: I1210 12:56:29.226913 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-cc955f7d4-bclr7_35b770c5-bcea-4f68-8c5b-fb852f8b97a9/horizon/0.log" Dec 10 12:56:29 crc kubenswrapper[4852]: I1210 12:56:29.460888 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nshgs_ca5421d7-d674-4ead-b580-d8c63cdffb0c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:29 crc kubenswrapper[4852]: I1210 12:56:29.518090 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-cc955f7d4-bclr7_35b770c5-bcea-4f68-8c5b-fb852f8b97a9/horizon-log/0.log" Dec 10 12:56:29 crc kubenswrapper[4852]: I1210 12:56:29.588548 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rh9qn_40acc70f-2b91-4e6e-af47-b525289badc8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:29 crc kubenswrapper[4852]: I1210 12:56:29.869638 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_20797400-1dd7-4c4b-af50-9f0c839a06c6/kube-state-metrics/0.log" Dec 10 12:56:30 crc kubenswrapper[4852]: I1210 12:56:30.249406 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd_2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:30 crc kubenswrapper[4852]: I1210 12:56:30.476725 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7dd8c6757f-lbdxp_fbe4801a-4ecc-4ecd-b00d-da9917481e2e/keystone-api/0.log" Dec 10 12:56:30 crc kubenswrapper[4852]: I1210 12:56:30.783799 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9dd466c4f-pgb9f_63687f02-3cc2-4640-88f1-e312bbe550e7/neutron-api/0.log" Dec 10 12:56:30 crc kubenswrapper[4852]: I1210 12:56:30.824069 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9dd466c4f-pgb9f_63687f02-3cc2-4640-88f1-e312bbe550e7/neutron-httpd/0.log" Dec 10 12:56:30 crc kubenswrapper[4852]: I1210 12:56:30.925990 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh_9d30136b-22e2-4932-9da4-836b2368d7bc/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:31 crc kubenswrapper[4852]: I1210 12:56:31.529354 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9cebd010-0435-40cc-9d60-2359682ee83e/nova-api-log/0.log" Dec 10 12:56:31 crc kubenswrapper[4852]: I1210 12:56:31.760925 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_70c6e25c-cf76-4ec0-9981-5a8dbc98d07e/nova-cell0-conductor-conductor/0.log" Dec 10 12:56:31 crc kubenswrapper[4852]: I1210 12:56:31.855813 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9cebd010-0435-40cc-9d60-2359682ee83e/nova-api-api/0.log" Dec 10 12:56:32 crc kubenswrapper[4852]: I1210 12:56:32.052645 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a903af04-d97a-42ba-94c4-af5d3c84de08/nova-cell1-conductor-conductor/0.log" Dec 10 12:56:32 crc kubenswrapper[4852]: I1210 12:56:32.091109 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5f6d8b73-adeb-47cd-9150-613bda06874e/nova-cell1-novncproxy-novncproxy/0.log" Dec 10 12:56:32 crc kubenswrapper[4852]: I1210 12:56:32.334845 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-bjrzh_4d0aea88-1cca-4e75-bc26-15c9f44d8682/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:32 crc kubenswrapper[4852]: I1210 12:56:32.465324 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3/nova-metadata-log/0.log" Dec 10 12:56:32 crc kubenswrapper[4852]: I1210 12:56:32.703353 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9065a2ec-b14d-4376-87f7-2305a86dec0c/nova-scheduler-scheduler/0.log" Dec 10 12:56:32 crc kubenswrapper[4852]: I1210 12:56:32.802529 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06dd4615-ecfb-4e00-9dcf-ee18317d1f95/mysql-bootstrap/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.020478 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06dd4615-ecfb-4e00-9dcf-ee18317d1f95/mysql-bootstrap/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.051691 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06dd4615-ecfb-4e00-9dcf-ee18317d1f95/galera/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.178559 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d466b79-84c0-42e9-8952-8491b4ced74e/mysql-bootstrap/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.446354 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d466b79-84c0-42e9-8952-8491b4ced74e/galera/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.447298 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d466b79-84c0-42e9-8952-8491b4ced74e/mysql-bootstrap/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.573570 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3/nova-metadata-metadata/0.log" Dec 10 12:56:33 crc 
kubenswrapper[4852]: I1210 12:56:33.673491 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_12600c57-0ba3-4781-93cc-317e533e52d8/openstackclient/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.716764 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gkhdx_6246b317-7d73-49ff-bd8e-f4862a4584c6/ovn-controller/0.log" Dec 10 12:56:33 crc kubenswrapper[4852]: I1210 12:56:33.856513 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nspk8_2c06b796-7229-47bc-889c-4a78ef3a186a/openstack-network-exporter/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.003128 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovsdb-server-init/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.163753 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovsdb-server/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.189429 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovsdb-server-init/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.190038 4852 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5084ad3a-beee-4d5c-b13c-f9a5e17cd791"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5084ad3a-beee-4d5c-b13c-f9a5e17cd791] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5084ad3a_beee_4d5c_b13c_f9a5e17cd791.slice" Dec 10 12:56:34 crc kubenswrapper[4852]: E1210 12:56:34.190072 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5084ad3a-beee-4d5c-b13c-f9a5e17cd791] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5084ad3a-beee-4d5c-b13c-f9a5e17cd791] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5084ad3a_beee_4d5c_b13c_f9a5e17cd791.slice" pod="openshift-must-gather-24l7j/crc-debug-njd8j" podUID="5084ad3a-beee-4d5c-b13c-f9a5e17cd791" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.212951 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovs-vswitchd/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.451900 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9pkxx_438ab74a-135c-480f-9335-9e2f4f81c0c2/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.458686 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a4999290-010a-43e8-9622-04a117f98f3f/openstack-network-exporter/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.462541 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/crc-debug-njd8j" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.512740 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a4999290-010a-43e8-9622-04a117f98f3f/ovn-northd/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.686067 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0/openstack-network-exporter/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.765043 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0/ovsdbserver-nb/0.log" Dec 10 12:56:34 crc kubenswrapper[4852]: I1210 12:56:34.947260 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b16645b8-8fa6-46cc-848a-2815e736e9b2/openstack-network-exporter/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.082006 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b16645b8-8fa6-46cc-848a-2815e736e9b2/ovsdbserver-sb/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.119215 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db947f9b4-m6rgq_1ebd8c65-e675-462a-bdba-db5d0ea01754/placement-api/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.488797 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db947f9b4-m6rgq_1ebd8c65-e675-462a-bdba-db5d0ea01754/placement-log/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.533607 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_63001b32-e957-4b24-a742-7932191e7598/setup-container/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.766459 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_63001b32-e957-4b24-a742-7932191e7598/rabbitmq/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.824033 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_63001b32-e957-4b24-a742-7932191e7598/setup-container/0.log" Dec 10 12:56:35 crc kubenswrapper[4852]: I1210 12:56:35.831467 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_280ccc25-3ba2-46ea-b167-19480cb76a48/setup-container/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.095967 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_280ccc25-3ba2-46ea-b167-19480cb76a48/rabbitmq/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.104422 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_280ccc25-3ba2-46ea-b167-19480cb76a48/setup-container/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.210577 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r_f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.416898 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pf6kc_2bdc2caf-227b-4210-bdbd-adf085cf4e27/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.550117 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j_19836285-fe41-4d6e-8f05-b5aeac635c5c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.627645 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nlmsv_5d7d1222-768a-4615-8aaa-385740584e4e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.768833 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xxr4g_bc600c67-710c-494a-9fb0-866745c0709d/ssh-known-hosts-edpm-deployment/0.log" Dec 10 12:56:36 crc kubenswrapper[4852]: I1210 12:56:36.980006 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8698bf8cd7-bmf4z_a41546b5-9dd3-4400-97ba-4bf433dc2c2c/proxy-server/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.099488 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8698bf8cd7-bmf4z_a41546b5-9dd3-4400-97ba-4bf433dc2c2c/proxy-httpd/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.199460 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5w2l8_cce2fc32-02ab-4099-ac2f-c0eeca72f9a8/swift-ring-rebalance/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.357741 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-reaper/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.383760 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-auditor/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.491823 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-replicator/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.554146 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-server/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.640645 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-auditor/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.665576 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-replicator/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.731721 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-server/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.792060 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-updater/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.945693 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-replicator/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: I1210 12:56:37.962138 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-expirer/0.log" Dec 10 12:56:37 crc kubenswrapper[4852]: 
I1210 12:56:37.973866 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-auditor/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.027840 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-server/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.171903 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:56:38 crc kubenswrapper[4852]: E1210 12:56:38.172270 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.215200 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-updater/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.236062 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/swift-recon-cron/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.257258 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/rsync/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.522986 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb_33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.595982 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9238ddbd-fcdf-4612-974f-114508e02356/tempest-tests-tempest-tests-runner/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.844907 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ce2aca81-09ab-4dd7-b9b7-d35cef864a73/test-operator-logs-container/0.log" Dec 10 12:56:38 crc kubenswrapper[4852]: I1210 12:56:38.850875 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb_c8130005-2302-4ea1-8677-b590a256d3ec/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 12:56:47 crc kubenswrapper[4852]: I1210 12:56:47.430379 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7a324e51-4ea8-4cca-8cfd-6f64d13cd706/memcached/0.log" Dec 10 12:56:52 crc kubenswrapper[4852]: I1210 12:56:52.171096 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:56:52 crc kubenswrapper[4852]: E1210 12:56:52.172624 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.273170 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bx58k_88a0620c-81a0-4ad1-ae9a-13eb0d08e10f/kube-rbac-proxy/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.388371 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bx58k_88a0620c-81a0-4ad1-ae9a-13eb0d08e10f/manager/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.552222 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/util/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.672326 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/util/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.701706 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/pull/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.752971 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/pull/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.923137 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/util/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.926066 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/pull/0.log" Dec 10 12:57:03 crc kubenswrapper[4852]: I1210 12:57:03.952419 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/extract/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.097913 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-pnppk_97d20a41-52e0-47d5-86fd-0f486080ebf5/kube-rbac-proxy/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.134985 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-pnppk_97d20a41-52e0-47d5-86fd-0f486080ebf5/manager/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.182402 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-tlflj_74cd0e4c-bd25-4b22-8b1f-cb3758f446fd/kube-rbac-proxy/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.292489 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-tlflj_74cd0e4c-bd25-4b22-8b1f-cb3758f446fd/manager/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.368208 4852 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rhwzx_62e793a9-5b13-4532-90fe-d3313b3cf4d9/kube-rbac-proxy/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.517936 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-j8h26_391832bd-03d9-409e-93a0-b8986ed437ff/kube-rbac-proxy/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.560774 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-j8h26_391832bd-03d9-409e-93a0-b8986ed437ff/manager/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.657500 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rhwzx_62e793a9-5b13-4532-90fe-d3313b3cf4d9/manager/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.765003 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-54gtc_f67c3362-3da1-45f6-8fc6-47e16b206173/kube-rbac-proxy/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.803389 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-54gtc_f67c3362-3da1-45f6-8fc6-47e16b206173/manager/0.log" Dec 10 12:57:04 crc kubenswrapper[4852]: I1210 12:57:04.987669 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5p988_3fc3907c-5313-44d8-90dd-155b24156a1b/kube-rbac-proxy/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.094792 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-gbgfx_67ab896e-72eb-4040-9397-2a2bcca37c7e/kube-rbac-proxy/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.170544 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:57:05 crc kubenswrapper[4852]: E1210 12:57:05.170778 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.193417 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5p988_3fc3907c-5313-44d8-90dd-155b24156a1b/manager/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.225314 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-gbgfx_67ab896e-72eb-4040-9397-2a2bcca37c7e/manager/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.290076 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4kkrb_2a5cb708-ca60-4763-bf61-6562a610e6dc/kube-rbac-proxy/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.446584 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4kkrb_2a5cb708-ca60-4763-bf61-6562a610e6dc/manager/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.465053 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-b2jw2_9c39ec89-c5bf-4cdd-a253-154db7bcf781/kube-rbac-proxy/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.510086 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-b2jw2_9c39ec89-c5bf-4cdd-a253-154db7bcf781/manager/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.670640 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nlcxk_f53525dc-0dc9-44c5-a947-2e303cb0ed1c/manager/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.704378 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nlcxk_f53525dc-0dc9-44c5-a947-2e303cb0ed1c/kube-rbac-proxy/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.800754 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pzw5d_79986568-4439-4f2a-9dc4-af5fb1a1d787/kube-rbac-proxy/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.922746 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pzw5d_79986568-4439-4f2a-9dc4-af5fb1a1d787/manager/0.log" Dec 10 12:57:05 crc kubenswrapper[4852]: I1210 12:57:05.974123 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p22mj_bf62d827-9a6d-4a53-9a65-b287195f3bea/kube-rbac-proxy/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.112200 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p22mj_bf62d827-9a6d-4a53-9a65-b287195f3bea/manager/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.193431 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xr9c5_2a55ad46-c35b-4429-b1da-7a361f7c45d0/manager/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.209160 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xr9c5_2a55ad46-c35b-4429-b1da-7a361f7c45d0/kube-rbac-proxy/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.365265 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f8csbt_3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2/kube-rbac-proxy/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.419099 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f8csbt_3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2/manager/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.808086 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nhqss_62488b9a-bd45-4d7e-a890-f2d585698d58/registry-server/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.810892 4852 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5bc74577c9-ch9wf_267779dd-45a9-4ee6-985d-39fb7d7cb207/operator/0.log" Dec 10 12:57:06 crc kubenswrapper[4852]: I1210 12:57:06.993351 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-86mvp_75a5b678-ba48-4191-99e6-aeeaf32bf40e/kube-rbac-proxy/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.130519 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-86mvp_75a5b678-ba48-4191-99e6-aeeaf32bf40e/manager/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.254864 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-kc8c8_785eda15-0a5d-451d-8ec4-b35e1f8d8147/kube-rbac-proxy/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.338266 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-kc8c8_785eda15-0a5d-451d-8ec4-b35e1f8d8147/manager/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.429096 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xkcjc_524d7bc8-a871-4ff2-bc13-1a84d07bb0e9/operator/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.637586 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-q26ll_31cc1af5-d198-472a-aa62-2ce735f4453b/manager/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.675120 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-q26ll_31cc1af5-d198-472a-aa62-2ce735f4453b/kube-rbac-proxy/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.760468 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-lhzps_c7a73ae7-6060-497a-b94f-8988c2244f94/kube-rbac-proxy/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.896758 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zhxkc_6b9c74bb-9c09-4976-be53-8b2c296f7788/kube-rbac-proxy/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.959924 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zhxkc_6b9c74bb-9c09-4976-be53-8b2c296f7788/manager/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.978861 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-lhzps_c7a73ae7-6060-497a-b94f-8988c2244f94/manager/0.log" Dec 10 12:57:07 crc kubenswrapper[4852]: I1210 12:57:07.983668 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d7c94c9c8-s6npl_bbce747f-ad24-476e-8746-f2bb89eba637/manager/0.log" Dec 10 12:57:08 crc kubenswrapper[4852]: I1210 12:57:08.125920 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-57gb2_d01d86ae-5138-4298-8ec0-7aa8cdd468fe/kube-rbac-proxy/0.log" Dec 10 12:57:08 crc kubenswrapper[4852]: I1210 12:57:08.137089 4852 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-57gb2_d01d86ae-5138-4298-8ec0-7aa8cdd468fe/manager/0.log" Dec 10 12:57:19 crc kubenswrapper[4852]: I1210 12:57:19.170461 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:57:19 crc kubenswrapper[4852]: E1210 12:57:19.171309 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:57:26 crc kubenswrapper[4852]: I1210 12:57:26.881037 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-n8gzr_8bc2ea7c-2f45-49ac-b683-c57d84d8e758/control-plane-machine-set-operator/0.log" Dec 10 12:57:27 crc kubenswrapper[4852]: I1210 12:57:27.087109 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gmm6c_c9c92825-dfcf-4030-8fa7-4326fc350f10/kube-rbac-proxy/0.log" Dec 10 12:57:27 crc kubenswrapper[4852]: I1210 12:57:27.103384 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gmm6c_c9c92825-dfcf-4030-8fa7-4326fc350f10/machine-api-operator/0.log" Dec 10 12:57:31 crc kubenswrapper[4852]: I1210 12:57:31.170159 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:57:31 crc kubenswrapper[4852]: E1210 12:57:31.170971 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:57:40 crc kubenswrapper[4852]: I1210 12:57:40.399336 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nt4zl_bb5429e3-7f2e-4632-b68b-18de65b5e060/cert-manager-controller/0.log" Dec 10 12:57:40 crc kubenswrapper[4852]: I1210 12:57:40.533355 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-z86h2_dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4/cert-manager-cainjector/0.log" Dec 10 12:57:40 crc kubenswrapper[4852]: I1210 12:57:40.559371 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-rj8fh_78c79d4e-6293-4789-932e-2c42545750a5/cert-manager-webhook/0.log" Dec 10 12:57:46 crc kubenswrapper[4852]: I1210 12:57:46.170655 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:57:46 crc kubenswrapper[4852]: E1210 12:57:46.171628 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:57:53 crc kubenswrapper[4852]: I1210 12:57:53.154616 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-zzw4g_88acf534-fb28-4e05-bab0-f60364533fae/nmstate-console-plugin/0.log" Dec 10 12:57:53 crc kubenswrapper[4852]: I1210 12:57:53.397493 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-2mcc9_94e816ec-cfe3-413c-98f4-5d6f2880d16f/kube-rbac-proxy/0.log" Dec 10 12:57:53 crc kubenswrapper[4852]: I1210 12:57:53.427207 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pcbgh_21f5475e-7988-44d7-940f-76c59cf92f7e/nmstate-handler/0.log" Dec 10 12:57:53 crc kubenswrapper[4852]: I1210 12:57:53.481431 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-2mcc9_94e816ec-cfe3-413c-98f4-5d6f2880d16f/nmstate-metrics/0.log" Dec 10 12:57:53 crc kubenswrapper[4852]: I1210 12:57:53.618044 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-s8bfr_35feaa98-be47-42f8-af3b-bf8a5ef57ce4/nmstate-operator/0.log" Dec 10 12:57:53 crc kubenswrapper[4852]: I1210 12:57:53.693773 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-hx5dz_5b77c48c-a8a1-440d-8e0d-fab8d2087ede/nmstate-webhook/0.log" Dec 10 12:57:57 crc kubenswrapper[4852]: I1210 12:57:57.171124 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:57:57 crc kubenswrapper[4852]: E1210 12:57:57.171953 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:58:07 crc kubenswrapper[4852]: I1210 12:58:07.926864 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gj6ss_2335158c-cbc5-45a0-9438-a879aede67f1/kube-rbac-proxy/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.072119 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.153181 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gj6ss_2335158c-cbc5-45a0-9438-a879aede67f1/controller/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.221972 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.284268 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.301661 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.320853 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.527212 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.550051 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.576614 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.577110 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.754837 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.796762 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.825773 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 12:58:08 crc kubenswrapper[4852]: I1210 12:58:08.916532 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/controller/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.013763 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/frr-metrics/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.021188 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/kube-rbac-proxy/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.153422 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/kube-rbac-proxy-frr/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.171021 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:58:09 crc kubenswrapper[4852]: E1210 12:58:09.171378 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.268632 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/reloader/0.log" Dec 10 
12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.376374 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-9cr6b_9c84ab71-bcb9-4237-a827-4fe3c1c2c754/frr-k8s-webhook-server/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.556838 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c67dbf94b-rhs8b_356ac40e-2e68-4d75-81ca-b1e3306e263a/manager/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.788721 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d78c58b5f-5mtdv_c92ae4cf-27c3-46d4-9be9-8398e1276f61/webhook-server/0.log" Dec 10 12:58:09 crc kubenswrapper[4852]: I1210 12:58:09.886601 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jfz84_357f1ff0-29a8-4905-bac8-9bc8a5c03199/kube-rbac-proxy/0.log" Dec 10 12:58:10 crc kubenswrapper[4852]: I1210 12:58:10.358205 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/frr/0.log" Dec 10 12:58:10 crc kubenswrapper[4852]: I1210 12:58:10.463060 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jfz84_357f1ff0-29a8-4905-bac8-9bc8a5c03199/speaker/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.069759 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/util/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.169457 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:58:23 crc kubenswrapper[4852]: E1210 12:58:23.169859 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.311760 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/util/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.366409 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/pull/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.366604 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/pull/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.490377 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/pull/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.533371 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/util/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.534113 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/extract/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.681783 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/util/0.log" Dec 10 12:58:23 crc kubenswrapper[4852]: I1210 12:58:23.862490 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/util/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.310485 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/pull/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.311093 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/pull/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.311113 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/pull/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.445743 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/util/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.490677 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/extract/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.541890 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/extract-utilities/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.662377 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/extract-content/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.684101 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/extract-utilities/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.730103 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/extract-content/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.917729 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/extract-content/0.log" Dec 10 12:58:24 crc kubenswrapper[4852]: I1210 12:58:24.923324 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/extract-utilities/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.165986 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/extract-utilities/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.336082 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/extract-content/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.375713 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/extract-utilities/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.383164 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/extract-content/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.431480 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nbwq2_a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/registry-server/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.635221 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/extract-content/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.688498 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/extract-utilities/0.log" Dec 10 12:58:25 crc kubenswrapper[4852]: I1210 12:58:25.906113 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-64m8g_ff1e723c-986a-4c70-8340-aee0dacc330d/marketplace-operator/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.007860 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-utilities/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.266680 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-content/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.288195 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-utilities/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.323825 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-content/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.492815 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5hzpk_8ad77dbc-86d2-4bbc-8312-4529077f52a6/registry-server/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.514432 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-content/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.516530 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-utilities/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.761151 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-utilities/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.794199 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/registry-server/0.log" Dec 10 12:58:26 crc kubenswrapper[4852]: I1210 12:58:26.988169 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-utilities/0.log" Dec 10 12:58:27 crc kubenswrapper[4852]: I1210 12:58:27.018570 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-content/0.log" Dec 10 12:58:27 crc kubenswrapper[4852]: I1210 12:58:27.018570 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-content/0.log" Dec 10 12:58:27 crc kubenswrapper[4852]: I1210 12:58:27.199497 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-utilities/0.log" Dec 10 12:58:27 crc kubenswrapper[4852]: I1210 12:58:27.231024 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-content/0.log" Dec 10 12:58:28 crc kubenswrapper[4852]: I1210 12:58:28.439313 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/registry-server/0.log" Dec 10 12:58:38 crc kubenswrapper[4852]: I1210 12:58:38.170161 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:58:38 crc kubenswrapper[4852]: E1210 12:58:38.171022 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:58:52 crc kubenswrapper[4852]: I1210 12:58:52.170292 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:58:52 crc kubenswrapper[4852]: E1210 12:58:52.171211 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:58:52 crc kubenswrapper[4852]: E1210 12:58:52.827933 4852 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.73:49656->38.102.83.73:42555: write tcp 
38.102.83.73:49656->38.102.83.73:42555: write: broken pipe Dec 10 12:59:04 crc kubenswrapper[4852]: I1210 12:59:04.176077 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:59:04 crc kubenswrapper[4852]: E1210 12:59:04.177014 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.874065 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2872c"] Dec 10 12:59:13 crc kubenswrapper[4852]: E1210 12:59:13.875180 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" containerName="container-00" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.875199 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" containerName="container-00" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.875445 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bff4cf-aac4-4ddd-8028-e49dda6a2b31" containerName="container-00" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.876912 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.895042 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2872c"] Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.940709 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9151bb1d-ba24-436f-a64f-a40292d34e64-utilities\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.940834 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5cct\" (UniqueName: \"kubernetes.io/projected/9151bb1d-ba24-436f-a64f-a40292d34e64-kube-api-access-g5cct\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:13 crc kubenswrapper[4852]: I1210 12:59:13.940972 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9151bb1d-ba24-436f-a64f-a40292d34e64-catalog-content\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.042569 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9151bb1d-ba24-436f-a64f-a40292d34e64-catalog-content\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 
12:59:14.042669 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9151bb1d-ba24-436f-a64f-a40292d34e64-utilities\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.042759 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5cct\" (UniqueName: \"kubernetes.io/projected/9151bb1d-ba24-436f-a64f-a40292d34e64-kube-api-access-g5cct\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.043333 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9151bb1d-ba24-436f-a64f-a40292d34e64-catalog-content\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.043413 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9151bb1d-ba24-436f-a64f-a40292d34e64-utilities\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.066264 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5cct\" (UniqueName: \"kubernetes.io/projected/9151bb1d-ba24-436f-a64f-a40292d34e64-kube-api-access-g5cct\") pod \"certified-operators-2872c\" (UID: \"9151bb1d-ba24-436f-a64f-a40292d34e64\") " pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.212684 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:14 crc kubenswrapper[4852]: I1210 12:59:14.831514 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2872c"] Dec 10 12:59:15 crc kubenswrapper[4852]: I1210 12:59:15.070390 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2872c" event={"ID":"9151bb1d-ba24-436f-a64f-a40292d34e64","Type":"ContainerStarted","Data":"c9d9063a55963688df29f38c84d7fc965fd2a6e80088a8505e44d5708d892276"} Dec 10 12:59:16 crc kubenswrapper[4852]: I1210 12:59:16.080588 4852 generic.go:334] "Generic (PLEG): container finished" podID="9151bb1d-ba24-436f-a64f-a40292d34e64" containerID="ed7e508525326277eccfc9c5651e41a480de53d14a030ea2465cda0024754c4b" exitCode=0 Dec 10 12:59:16 crc kubenswrapper[4852]: I1210 12:59:16.080696 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2872c" event={"ID":"9151bb1d-ba24-436f-a64f-a40292d34e64","Type":"ContainerDied","Data":"ed7e508525326277eccfc9c5651e41a480de53d14a030ea2465cda0024754c4b"} Dec 10 12:59:16 crc kubenswrapper[4852]: I1210 12:59:16.170369 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:59:16 crc kubenswrapper[4852]: E1210 12:59:16.170773 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.257682 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79z9b"] Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.260009 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.266819 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79z9b"] Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.443813 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-catalog-content\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.444031 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6dw7\" (UniqueName: \"kubernetes.io/projected/4e09fef1-e443-44f1-a73c-9125d01fd960-kube-api-access-x6dw7\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.444115 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-utilities\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.546726 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-catalog-content\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.546820 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6dw7\" (UniqueName: \"kubernetes.io/projected/4e09fef1-e443-44f1-a73c-9125d01fd960-kube-api-access-x6dw7\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.546883 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-utilities\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.547615 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-utilities\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.547712 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-catalog-content\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.573657 4852 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x6dw7\" (UniqueName: \"kubernetes.io/projected/4e09fef1-e443-44f1-a73c-9125d01fd960-kube-api-access-x6dw7\") pod \"redhat-marketplace-79z9b\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:18 crc kubenswrapper[4852]: I1210 12:59:18.580979 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:19 crc kubenswrapper[4852]: I1210 12:59:19.098081 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79z9b"] Dec 10 12:59:20 crc kubenswrapper[4852]: I1210 12:59:20.116590 4852 generic.go:334] "Generic (PLEG): container finished" podID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerID="0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335" exitCode=0 Dec 10 12:59:20 crc kubenswrapper[4852]: I1210 12:59:20.116807 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79z9b" event={"ID":"4e09fef1-e443-44f1-a73c-9125d01fd960","Type":"ContainerDied","Data":"0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335"} Dec 10 12:59:20 crc kubenswrapper[4852]: I1210 12:59:20.117197 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79z9b" event={"ID":"4e09fef1-e443-44f1-a73c-9125d01fd960","Type":"ContainerStarted","Data":"3a9bb62531e525a29068b317d7920b5bb700a63681f3044039d5a074dd249ca6"} Dec 10 12:59:21 crc kubenswrapper[4852]: I1210 12:59:21.263810 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 10 12:59:22 crc kubenswrapper[4852]: I1210 12:59:22.137551 4852 generic.go:334] "Generic (PLEG): container finished" podID="9151bb1d-ba24-436f-a64f-a40292d34e64" containerID="0f222e4a435953e5911b35973fdde228fd80bc5b609bfebae274c6f73f0ba053" exitCode=0 Dec 10 12:59:22 crc kubenswrapper[4852]: I1210 12:59:22.137912 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2872c" event={"ID":"9151bb1d-ba24-436f-a64f-a40292d34e64","Type":"ContainerDied","Data":"0f222e4a435953e5911b35973fdde228fd80bc5b609bfebae274c6f73f0ba053"} Dec 10 12:59:23 crc kubenswrapper[4852]: I1210 12:59:23.152573 4852 generic.go:334] "Generic (PLEG): container finished" podID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerID="76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472" exitCode=0 Dec 10 12:59:23 crc kubenswrapper[4852]: I1210 12:59:23.152696 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79z9b" event={"ID":"4e09fef1-e443-44f1-a73c-9125d01fd960","Type":"ContainerDied","Data":"76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472"} Dec 10 12:59:23 crc kubenswrapper[4852]: I1210 12:59:23.206524 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2872c" podStartSLOduration=3.627612516 podStartE2EDuration="10.206489813s" podCreationTimestamp="2025-12-10 12:59:13 +0000 UTC" firstStartedPulling="2025-12-10 12:59:16.082700857 +0000 UTC m=+4042.168226081" lastFinishedPulling="2025-12-10 12:59:22.661578144 +0000 UTC m=+4048.747103378" observedRunningTime="2025-12-10 12:59:23.193578382 +0000 UTC m=+4049.279103606" watchObservedRunningTime="2025-12-10 12:59:23.206489813 +0000 UTC m=+4049.292015037" Dec 10 12:59:24 crc kubenswrapper[4852]: I1210 
12:59:24.194511 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2872c" event={"ID":"9151bb1d-ba24-436f-a64f-a40292d34e64","Type":"ContainerStarted","Data":"5495dbb8c913bb489c2e8f9491cbfed8963b5450fb3d048a39678ec86748f559"} Dec 10 12:59:24 crc kubenswrapper[4852]: I1210 12:59:24.198211 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79z9b" event={"ID":"4e09fef1-e443-44f1-a73c-9125d01fd960","Type":"ContainerStarted","Data":"d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885"} Dec 10 12:59:24 crc kubenswrapper[4852]: I1210 12:59:24.212900 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:24 crc kubenswrapper[4852]: I1210 12:59:24.212962 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:24 crc kubenswrapper[4852]: I1210 12:59:24.225440 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79z9b" podStartSLOduration=3.562821224 podStartE2EDuration="6.225416127s" podCreationTimestamp="2025-12-10 12:59:18 +0000 UTC" firstStartedPulling="2025-12-10 12:59:21.263499086 +0000 UTC m=+4047.349024320" lastFinishedPulling="2025-12-10 12:59:23.926093999 +0000 UTC m=+4050.011619223" observedRunningTime="2025-12-10 12:59:24.216070224 +0000 UTC m=+4050.301595458" watchObservedRunningTime="2025-12-10 12:59:24.225416127 +0000 UTC m=+4050.310941361" Dec 10 12:59:25 crc kubenswrapper[4852]: I1210 12:59:25.266509 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2872c" podUID="9151bb1d-ba24-436f-a64f-a40292d34e64" containerName="registry-server" probeResult="failure" output=< Dec 10 12:59:25 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s Dec 10 12:59:25 crc kubenswrapper[4852]: > Dec 10 12:59:28 crc kubenswrapper[4852]: I1210 12:59:28.582623 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:28 crc kubenswrapper[4852]: I1210 12:59:28.583335 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:28 crc kubenswrapper[4852]: I1210 12:59:28.662558 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:29 crc kubenswrapper[4852]: I1210 12:59:29.598762 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:29 crc kubenswrapper[4852]: I1210 12:59:29.657500 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79z9b"] Dec 10 12:59:31 crc kubenswrapper[4852]: I1210 12:59:31.169900 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:59:31 crc kubenswrapper[4852]: E1210 12:59:31.170182 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:59:31 crc kubenswrapper[4852]: I1210 12:59:31.564400 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79z9b" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="registry-server" containerID="cri-o://d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885" gracePeriod=2 Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.074068 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.241362 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6dw7\" (UniqueName: \"kubernetes.io/projected/4e09fef1-e443-44f1-a73c-9125d01fd960-kube-api-access-x6dw7\") pod \"4e09fef1-e443-44f1-a73c-9125d01fd960\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.241912 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-utilities\") pod \"4e09fef1-e443-44f1-a73c-9125d01fd960\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.242004 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-catalog-content\") pod \"4e09fef1-e443-44f1-a73c-9125d01fd960\" (UID: \"4e09fef1-e443-44f1-a73c-9125d01fd960\") " Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.243138 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-utilities" (OuterVolumeSpecName: "utilities") pod "4e09fef1-e443-44f1-a73c-9125d01fd960" (UID: "4e09fef1-e443-44f1-a73c-9125d01fd960"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.246329 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e09fef1-e443-44f1-a73c-9125d01fd960-kube-api-access-x6dw7" (OuterVolumeSpecName: "kube-api-access-x6dw7") pod "4e09fef1-e443-44f1-a73c-9125d01fd960" (UID: "4e09fef1-e443-44f1-a73c-9125d01fd960"). InnerVolumeSpecName "kube-api-access-x6dw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.263710 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e09fef1-e443-44f1-a73c-9125d01fd960" (UID: "4e09fef1-e443-44f1-a73c-9125d01fd960"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.344889 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.344933 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6dw7\" (UniqueName: \"kubernetes.io/projected/4e09fef1-e443-44f1-a73c-9125d01fd960-kube-api-access-x6dw7\") on node \"crc\" DevicePath \"\"" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.344948 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e09fef1-e443-44f1-a73c-9125d01fd960-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.576383 4852 generic.go:334] "Generic (PLEG): container finished" podID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerID="d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885" exitCode=0 Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.576437 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79z9b" event={"ID":"4e09fef1-e443-44f1-a73c-9125d01fd960","Type":"ContainerDied","Data":"d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885"} Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.576463 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79z9b" event={"ID":"4e09fef1-e443-44f1-a73c-9125d01fd960","Type":"ContainerDied","Data":"3a9bb62531e525a29068b317d7920b5bb700a63681f3044039d5a074dd249ca6"} Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.576482 4852 scope.go:117] "RemoveContainer" containerID="d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.576564 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79z9b" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.606495 4852 scope.go:117] "RemoveContainer" containerID="76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.632471 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79z9b"] Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.641326 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79z9b"] Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.651349 4852 scope.go:117] "RemoveContainer" containerID="0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.703149 4852 scope.go:117] "RemoveContainer" containerID="d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885" Dec 10 12:59:32 crc kubenswrapper[4852]: E1210 12:59:32.703674 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885\": container with ID starting with d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885 not found: ID does not exist" containerID="d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.703723 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885"} err="failed to get container status \"d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885\": rpc error: code = NotFound desc = could not find container \"d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885\": container with ID starting with d71e105185d4cc1a671ba02c2bd0eeaebe23504a8d3a1cc69915d0a7b43c2885 not found: ID does not exist" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.703755 4852 scope.go:117] "RemoveContainer" containerID="76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472" Dec 10 12:59:32 crc kubenswrapper[4852]: E1210 12:59:32.704007 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472\": container with ID starting with 76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472 not found: ID does not exist" containerID="76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.704032 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472"} err="failed to get container status \"76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472\": rpc error: code = NotFound desc = could not find container \"76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472\": container with ID starting with 76a63c76c503a7de35309286fd2ab2a45f339c514e0e81e218a2b9fffb3fe472 not found: ID does not exist" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.704047 4852 scope.go:117] "RemoveContainer" containerID="0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335" Dec 10 12:59:32 crc kubenswrapper[4852]: E1210 12:59:32.704297 4852 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335\": container with ID starting with 0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335 not found: ID does not exist" containerID="0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335" Dec 10 12:59:32 crc kubenswrapper[4852]: I1210 12:59:32.704322 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335"} err="failed to get container status \"0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335\": rpc error: code = NotFound desc = could not find container \"0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335\": container with ID starting with 0b07f248aaa1cab451eb66b83c867234d6755aae03288162e74a566281c73335 not found: ID does not exist" Dec 10 12:59:34 crc kubenswrapper[4852]: I1210 12:59:34.185265 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" path="/var/lib/kubelet/pods/4e09fef1-e443-44f1-a73c-9125d01fd960/volumes" Dec 10 12:59:34 crc kubenswrapper[4852]: I1210 12:59:34.716045 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:34 crc kubenswrapper[4852]: I1210 12:59:34.782277 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2872c" Dec 10 12:59:35 crc kubenswrapper[4852]: I1210 12:59:35.195908 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2872c"] Dec 10 12:59:35 crc kubenswrapper[4852]: I1210 12:59:35.350584 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nbwq2"] Dec 10 12:59:35 crc kubenswrapper[4852]: I1210 12:59:35.350854 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nbwq2" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="registry-server" containerID="cri-o://9dcf5a7658bdef305e82063b4011a5f908ccb533e4139a2f372225399dd50dd4" gracePeriod=2 Dec 10 12:59:36 crc kubenswrapper[4852]: I1210 12:59:36.646199 4852 generic.go:334] "Generic (PLEG): container finished" podID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerID="9dcf5a7658bdef305e82063b4011a5f908ccb533e4139a2f372225399dd50dd4" exitCode=0 Dec 10 12:59:36 crc kubenswrapper[4852]: I1210 12:59:36.646293 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbwq2" event={"ID":"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d","Type":"ContainerDied","Data":"9dcf5a7658bdef305e82063b4011a5f908ccb533e4139a2f372225399dd50dd4"} Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.241496 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.368071 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-catalog-content\") pod \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.368225 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-utilities\") pod \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.368358 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsbjb\" (UniqueName: \"kubernetes.io/projected/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-kube-api-access-nsbjb\") pod \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\" (UID: \"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d\") " Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.370059 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-utilities" (OuterVolumeSpecName: "utilities") pod "a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" (UID: "a1c877d3-cccd-42c9-8f0b-2cc89e43b01d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.376337 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-kube-api-access-nsbjb" (OuterVolumeSpecName: "kube-api-access-nsbjb") pod "a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" (UID: "a1c877d3-cccd-42c9-8f0b-2cc89e43b01d"). InnerVolumeSpecName "kube-api-access-nsbjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.439835 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" (UID: "a1c877d3-cccd-42c9-8f0b-2cc89e43b01d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.470770 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.470814 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsbjb\" (UniqueName: \"kubernetes.io/projected/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-kube-api-access-nsbjb\") on node \"crc\" DevicePath \"\"" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.470832 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.658829 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbwq2" event={"ID":"a1c877d3-cccd-42c9-8f0b-2cc89e43b01d","Type":"ContainerDied","Data":"82658192db401aa6d526c1964ce59171b12a52e47fb8af0223390e04b22da256"} Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.658895 4852 scope.go:117] "RemoveContainer" containerID="9dcf5a7658bdef305e82063b4011a5f908ccb533e4139a2f372225399dd50dd4" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.658939 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nbwq2" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.704702 4852 scope.go:117] "RemoveContainer" containerID="2cb4645019933d1bd04ea9ca34b160e0e88891cbe74d3362f93f60aa9fae30e3" Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.706035 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nbwq2"] Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.713416 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nbwq2"] Dec 10 12:59:37 crc kubenswrapper[4852]: I1210 12:59:37.727471 4852 scope.go:117] "RemoveContainer" containerID="416afe29d58a33c6f40f8a87090ea3f2e86be86c58cad05159685bb11fb1a8ab" Dec 10 12:59:38 crc kubenswrapper[4852]: I1210 12:59:38.183465 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" path="/var/lib/kubelet/pods/a1c877d3-cccd-42c9-8f0b-2cc89e43b01d/volumes" Dec 10 12:59:44 crc kubenswrapper[4852]: I1210 12:59:44.179727 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:59:44 crc kubenswrapper[4852]: E1210 12:59:44.181048 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 12:59:59 crc kubenswrapper[4852]: I1210 12:59:59.169774 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 12:59:59 crc kubenswrapper[4852]: E1210 12:59:59.170795 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195098 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v"] Dec 10 13:00:00 crc kubenswrapper[4852]: E1210 13:00:00.195776 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="extract-content" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195792 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="extract-content" Dec 10 13:00:00 crc kubenswrapper[4852]: E1210 13:00:00.195816 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="registry-server" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195824 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="registry-server" Dec 10 13:00:00 crc kubenswrapper[4852]: E1210 13:00:00.195837 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="extract-content" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195846 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="extract-content" Dec 10 13:00:00 crc kubenswrapper[4852]: E1210 13:00:00.195864 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="registry-server" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195871 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="registry-server" Dec 10 13:00:00 crc kubenswrapper[4852]: E1210 13:00:00.195897 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="extract-utilities" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195906 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="extract-utilities" Dec 10 13:00:00 crc kubenswrapper[4852]: E1210 13:00:00.195916 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="extract-utilities" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.195923 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="extract-utilities" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.196149 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e09fef1-e443-44f1-a73c-9125d01fd960" containerName="registry-server" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.196189 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c877d3-cccd-42c9-8f0b-2cc89e43b01d" containerName="registry-server" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.196978 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v"] Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.197132 
4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.200801 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.201056 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.263307 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5kz\" (UniqueName: \"kubernetes.io/projected/e14da2e8-1aff-4adf-971f-3b6bb65aea65-kube-api-access-4n5kz\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.263405 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e14da2e8-1aff-4adf-971f-3b6bb65aea65-config-volume\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.263690 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e14da2e8-1aff-4adf-971f-3b6bb65aea65-secret-volume\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.366105 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e14da2e8-1aff-4adf-971f-3b6bb65aea65-secret-volume\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.366588 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5kz\" (UniqueName: \"kubernetes.io/projected/e14da2e8-1aff-4adf-971f-3b6bb65aea65-kube-api-access-4n5kz\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.366630 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e14da2e8-1aff-4adf-971f-3b6bb65aea65-config-volume\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.367666 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e14da2e8-1aff-4adf-971f-3b6bb65aea65-config-volume\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.379165 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e14da2e8-1aff-4adf-971f-3b6bb65aea65-secret-volume\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.381829 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5kz\" (UniqueName: \"kubernetes.io/projected/e14da2e8-1aff-4adf-971f-3b6bb65aea65-kube-api-access-4n5kz\") pod \"collect-profiles-29422860-qln6v\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.522106 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:00 crc kubenswrapper[4852]: I1210 13:00:00.996577 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v"] Dec 10 13:00:01 crc kubenswrapper[4852]: W1210 13:00:01.568912 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode14da2e8_1aff_4adf_971f_3b6bb65aea65.slice/crio-2b637f9958cbfe942247c5577e595f6c9edc76d61778e3982589f7b656cc5d8e WatchSource:0}: Error finding container 2b637f9958cbfe942247c5577e595f6c9edc76d61778e3982589f7b656cc5d8e: Status 404 returned error can't find the container with id 2b637f9958cbfe942247c5577e595f6c9edc76d61778e3982589f7b656cc5d8e Dec 10 13:00:01 crc kubenswrapper[4852]: I1210 13:00:01.930353 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" event={"ID":"e14da2e8-1aff-4adf-971f-3b6bb65aea65","Type":"ContainerStarted","Data":"60ce5770cb05481a2da07844cf7d0c329b3c6f82c0072c0b48a4cda29d4102e6"} Dec 10 13:00:01 crc kubenswrapper[4852]: I1210 13:00:01.930694 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" event={"ID":"e14da2e8-1aff-4adf-971f-3b6bb65aea65","Type":"ContainerStarted","Data":"2b637f9958cbfe942247c5577e595f6c9edc76d61778e3982589f7b656cc5d8e"} Dec 10 13:00:01 crc kubenswrapper[4852]: I1210 13:00:01.948049 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" podStartSLOduration=1.948012665 podStartE2EDuration="1.948012665s" podCreationTimestamp="2025-12-10 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 13:00:01.944814755 +0000 UTC m=+4088.030339979" watchObservedRunningTime="2025-12-10 13:00:01.948012665 +0000 UTC m=+4088.033537909" Dec 10 13:00:02 crc kubenswrapper[4852]: I1210 13:00:02.953136 4852 generic.go:334] "Generic (PLEG): container finished" podID="e14da2e8-1aff-4adf-971f-3b6bb65aea65" containerID="60ce5770cb05481a2da07844cf7d0c329b3c6f82c0072c0b48a4cda29d4102e6" exitCode=0 Dec 10 13:00:02 crc kubenswrapper[4852]: I1210 13:00:02.953203 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" event={"ID":"e14da2e8-1aff-4adf-971f-3b6bb65aea65","Type":"ContainerDied","Data":"60ce5770cb05481a2da07844cf7d0c329b3c6f82c0072c0b48a4cda29d4102e6"} Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.343689 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.369301 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e14da2e8-1aff-4adf-971f-3b6bb65aea65-config-volume\") pod \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.369614 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n5kz\" (UniqueName: \"kubernetes.io/projected/e14da2e8-1aff-4adf-971f-3b6bb65aea65-kube-api-access-4n5kz\") pod \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.369807 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e14da2e8-1aff-4adf-971f-3b6bb65aea65-secret-volume\") pod \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\" (UID: \"e14da2e8-1aff-4adf-971f-3b6bb65aea65\") " Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.369941 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e14da2e8-1aff-4adf-971f-3b6bb65aea65-config-volume" (OuterVolumeSpecName: "config-volume") pod "e14da2e8-1aff-4adf-971f-3b6bb65aea65" (UID: "e14da2e8-1aff-4adf-971f-3b6bb65aea65"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.370642 4852 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e14da2e8-1aff-4adf-971f-3b6bb65aea65-config-volume\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.375130 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14da2e8-1aff-4adf-971f-3b6bb65aea65-kube-api-access-4n5kz" (OuterVolumeSpecName: "kube-api-access-4n5kz") pod "e14da2e8-1aff-4adf-971f-3b6bb65aea65" (UID: "e14da2e8-1aff-4adf-971f-3b6bb65aea65"). InnerVolumeSpecName "kube-api-access-4n5kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.378270 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14da2e8-1aff-4adf-971f-3b6bb65aea65-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e14da2e8-1aff-4adf-971f-3b6bb65aea65" (UID: "e14da2e8-1aff-4adf-971f-3b6bb65aea65"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.472583 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n5kz\" (UniqueName: \"kubernetes.io/projected/e14da2e8-1aff-4adf-971f-3b6bb65aea65-kube-api-access-4n5kz\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.472626 4852 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e14da2e8-1aff-4adf-971f-3b6bb65aea65-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.981042 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" event={"ID":"e14da2e8-1aff-4adf-971f-3b6bb65aea65","Type":"ContainerDied","Data":"2b637f9958cbfe942247c5577e595f6c9edc76d61778e3982589f7b656cc5d8e"} Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.981391 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b637f9958cbfe942247c5577e595f6c9edc76d61778e3982589f7b656cc5d8e" Dec 10 13:00:04 crc kubenswrapper[4852]: I1210 13:00:04.981115 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29422860-qln6v" Dec 10 13:00:05 crc kubenswrapper[4852]: I1210 13:00:05.034461 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4"] Dec 10 13:00:05 crc kubenswrapper[4852]: I1210 13:00:05.044885 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29422815-pvrg4"] Dec 10 13:00:06 crc kubenswrapper[4852]: I1210 13:00:06.183662 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07657eb-924e-43f3-b088-c851f8c62424" path="/var/lib/kubelet/pods/f07657eb-924e-43f3-b088-c851f8c62424/volumes" Dec 10 13:00:06 crc kubenswrapper[4852]: I1210 13:00:06.714963 4852 scope.go:117] "RemoveContainer" containerID="5cd5645fea72c98047c9c3dcfa0b983b1a975a5e273e345550e7cc606fec5f2e" Dec 10 13:00:13 crc kubenswrapper[4852]: I1210 13:00:13.169685 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:00:13 crc kubenswrapper[4852]: E1210 13:00:13.170287 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 13:00:20 crc kubenswrapper[4852]: I1210 13:00:20.149305 4852 generic.go:334] "Generic (PLEG): container finished" podID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerID="d8d3fc60cfaabdc7b1ccf9642e47aa5c5ab68e2f66fa10ecaeb1859856f51960" exitCode=0 Dec 10 13:00:20 crc kubenswrapper[4852]: I1210 13:00:20.149431 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24l7j/must-gather-h899f" event={"ID":"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f","Type":"ContainerDied","Data":"d8d3fc60cfaabdc7b1ccf9642e47aa5c5ab68e2f66fa10ecaeb1859856f51960"} Dec 10 13:00:20 crc kubenswrapper[4852]: I1210 13:00:20.150438 4852 scope.go:117] "RemoveContainer" 
containerID="d8d3fc60cfaabdc7b1ccf9642e47aa5c5ab68e2f66fa10ecaeb1859856f51960" Dec 10 13:00:20 crc kubenswrapper[4852]: I1210 13:00:20.581729 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-24l7j_must-gather-h899f_86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f/gather/0.log" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.331878 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xk9cn"] Dec 10 13:00:23 crc kubenswrapper[4852]: E1210 13:00:23.336140 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14da2e8-1aff-4adf-971f-3b6bb65aea65" containerName="collect-profiles" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.336182 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14da2e8-1aff-4adf-971f-3b6bb65aea65" containerName="collect-profiles" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.336508 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14da2e8-1aff-4adf-971f-3b6bb65aea65" containerName="collect-profiles" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.338352 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.369893 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk9cn"] Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.483289 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s2sg\" (UniqueName: \"kubernetes.io/projected/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-kube-api-access-4s2sg\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.483363 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-utilities\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.483454 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-catalog-content\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.585751 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s2sg\" (UniqueName: \"kubernetes.io/projected/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-kube-api-access-4s2sg\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.585838 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-utilities\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.585911 4852 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-catalog-content\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.586525 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-utilities\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.586530 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-catalog-content\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.616414 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s2sg\" (UniqueName: \"kubernetes.io/projected/69e59a92-b2e5-41c5-ba51-6d8a67b08da1-kube-api-access-4s2sg\") pod \"community-operators-xk9cn\" (UID: \"69e59a92-b2e5-41c5-ba51-6d8a67b08da1\") " pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:23 crc kubenswrapper[4852]: I1210 13:00:23.684060 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:24 crc kubenswrapper[4852]: I1210 13:00:24.018395 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk9cn"] Dec 10 13:00:24 crc kubenswrapper[4852]: I1210 13:00:24.202609 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk9cn" event={"ID":"69e59a92-b2e5-41c5-ba51-6d8a67b08da1","Type":"ContainerStarted","Data":"45c91ef42c80c9b66a0a475c57981a359c7d9987256735a96b3e6a6a93c5a88c"} Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.213574 4852 generic.go:334] "Generic (PLEG): container finished" podID="69e59a92-b2e5-41c5-ba51-6d8a67b08da1" containerID="842ca87d2904323455d32b7a0a105ba26d3ae3335ef39bbb988921a127ac4860" exitCode=0 Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.213661 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk9cn" event={"ID":"69e59a92-b2e5-41c5-ba51-6d8a67b08da1","Type":"ContainerDied","Data":"842ca87d2904323455d32b7a0a105ba26d3ae3335ef39bbb988921a127ac4860"} Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.323997 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2btlg"] Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.326513 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.330344 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-catalog-content\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.330417 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-utilities\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.330471 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcqlr\" (UniqueName: \"kubernetes.io/projected/b94f86ba-a27e-4278-87e1-afbe2e73df15-kube-api-access-jcqlr\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.336827 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2btlg"] Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.432723 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-catalog-content\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.432786 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-utilities\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.432833 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcqlr\" (UniqueName: \"kubernetes.io/projected/b94f86ba-a27e-4278-87e1-afbe2e73df15-kube-api-access-jcqlr\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.433259 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-catalog-content\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.433452 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-utilities\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.455597 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jcqlr\" (UniqueName: \"kubernetes.io/projected/b94f86ba-a27e-4278-87e1-afbe2e73df15-kube-api-access-jcqlr\") pod \"redhat-operators-2btlg\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:25 crc kubenswrapper[4852]: I1210 13:00:25.660690 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:26 crc kubenswrapper[4852]: W1210 13:00:26.168386 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94f86ba_a27e_4278_87e1_afbe2e73df15.slice/crio-b9fb3c5c0d4a620bbb68b30ebdf2ff068eff05683750dbab1afbd63604521f1f WatchSource:0}: Error finding container b9fb3c5c0d4a620bbb68b30ebdf2ff068eff05683750dbab1afbd63604521f1f: Status 404 returned error can't find the container with id b9fb3c5c0d4a620bbb68b30ebdf2ff068eff05683750dbab1afbd63604521f1f Dec 10 13:00:26 crc kubenswrapper[4852]: I1210 13:00:26.170736 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:00:26 crc kubenswrapper[4852]: E1210 13:00:26.171045 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 13:00:26 crc kubenswrapper[4852]: I1210 13:00:26.189118 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2btlg"] Dec 10 13:00:26 crc kubenswrapper[4852]: I1210 13:00:26.226402 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerStarted","Data":"b9fb3c5c0d4a620bbb68b30ebdf2ff068eff05683750dbab1afbd63604521f1f"} Dec 10 13:00:27 crc kubenswrapper[4852]: I1210 13:00:27.237074 4852 generic.go:334] "Generic (PLEG): container finished" podID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerID="cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e" exitCode=0 Dec 10 13:00:27 crc kubenswrapper[4852]: I1210 13:00:27.237170 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerDied","Data":"cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e"} Dec 10 13:00:28 crc kubenswrapper[4852]: I1210 13:00:28.396645 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24l7j/must-gather-h899f"] Dec 10 13:00:28 crc kubenswrapper[4852]: I1210 13:00:28.397182 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-24l7j/must-gather-h899f" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="copy" containerID="cri-o://f1350aad3f798dd717c06816140a8c8d3a9c7df8a61aa6e74a8084c2714e0662" gracePeriod=2 Dec 10 13:00:28 crc kubenswrapper[4852]: I1210 13:00:28.405333 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24l7j/must-gather-h899f"] Dec 10 13:00:29 crc kubenswrapper[4852]: I1210 13:00:29.264001 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-24l7j_must-gather-h899f_86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f/copy/0.log" Dec 10 13:00:29 crc kubenswrapper[4852]: I1210 13:00:29.264647 4852 generic.go:334] "Generic (PLEG): container finished" podID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerID="f1350aad3f798dd717c06816140a8c8d3a9c7df8a61aa6e74a8084c2714e0662" exitCode=143 Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.729335 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-24l7j_must-gather-h899f_86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f/copy/0.log" Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.730571 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.793227 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-must-gather-output\") pod \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.793368 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68j8h\" (UniqueName: \"kubernetes.io/projected/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-kube-api-access-68j8h\") pod \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\" (UID: \"86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f\") " Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.830262 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-kube-api-access-68j8h" (OuterVolumeSpecName: "kube-api-access-68j8h") pod "86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" (UID: "86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f"). InnerVolumeSpecName "kube-api-access-68j8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.895834 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68j8h\" (UniqueName: \"kubernetes.io/projected/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-kube-api-access-68j8h\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.950285 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" (UID: "86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 13:00:30 crc kubenswrapper[4852]: I1210 13:00:30.998062 4852 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:31 crc kubenswrapper[4852]: I1210 13:00:31.286192 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-24l7j_must-gather-h899f_86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f/copy/0.log" Dec 10 13:00:31 crc kubenswrapper[4852]: I1210 13:00:31.286685 4852 scope.go:117] "RemoveContainer" containerID="f1350aad3f798dd717c06816140a8c8d3a9c7df8a61aa6e74a8084c2714e0662" Dec 10 13:00:31 crc kubenswrapper[4852]: I1210 13:00:31.286768 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24l7j/must-gather-h899f" Dec 10 13:00:31 crc kubenswrapper[4852]: I1210 13:00:31.596777 4852 scope.go:117] "RemoveContainer" containerID="d8d3fc60cfaabdc7b1ccf9642e47aa5c5ab68e2f66fa10ecaeb1859856f51960" Dec 10 13:00:32 crc kubenswrapper[4852]: I1210 13:00:32.179092 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" path="/var/lib/kubelet/pods/86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f/volumes" Dec 10 13:00:32 crc kubenswrapper[4852]: I1210 13:00:32.299288 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerStarted","Data":"7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7"} Dec 10 13:00:32 crc kubenswrapper[4852]: I1210 13:00:32.302711 4852 generic.go:334] "Generic (PLEG): container finished" podID="69e59a92-b2e5-41c5-ba51-6d8a67b08da1" containerID="547c884ed61faa34b1f40225b35d9f10c350a4e7c5b5cc79429bcbb02755b69e" exitCode=0 Dec 10 13:00:32 crc kubenswrapper[4852]: I1210 13:00:32.302758 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk9cn" event={"ID":"69e59a92-b2e5-41c5-ba51-6d8a67b08da1","Type":"ContainerDied","Data":"547c884ed61faa34b1f40225b35d9f10c350a4e7c5b5cc79429bcbb02755b69e"} Dec 10 13:00:33 crc kubenswrapper[4852]: I1210 13:00:33.315932 4852 generic.go:334] "Generic (PLEG): container finished" podID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerID="7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7" exitCode=0 Dec 10 13:00:33 crc kubenswrapper[4852]: I1210 13:00:33.315999 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerDied","Data":"7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7"} Dec 10 13:00:34 crc kubenswrapper[4852]: I1210 13:00:34.325324 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk9cn" event={"ID":"69e59a92-b2e5-41c5-ba51-6d8a67b08da1","Type":"ContainerStarted","Data":"8145a11c110d20b37a4c20f6a0e18f00a5712d2fc29b7a5c415deedfe299d99e"} Dec 10 13:00:34 crc kubenswrapper[4852]: I1210 13:00:34.350711 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xk9cn" podStartSLOduration=2.909108383 podStartE2EDuration="11.350688277s" podCreationTimestamp="2025-12-10 13:00:23 +0000 UTC" firstStartedPulling="2025-12-10 13:00:25.21601685 +0000 UTC m=+4111.301542074" lastFinishedPulling="2025-12-10 13:00:33.657596744 +0000 UTC m=+4119.743121968" observedRunningTime="2025-12-10 13:00:34.340963593 +0000 UTC m=+4120.426488817" watchObservedRunningTime="2025-12-10 13:00:34.350688277 +0000 UTC m=+4120.436213511" Dec 10 13:00:35 crc kubenswrapper[4852]: I1210 13:00:35.341995 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerStarted","Data":"5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159"} Dec 10 13:00:35 crc kubenswrapper[4852]: I1210 13:00:35.369573 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2btlg" podStartSLOduration=3.423959403 podStartE2EDuration="10.369550145s" podCreationTimestamp="2025-12-10 13:00:25 
+0000 UTC" firstStartedPulling="2025-12-10 13:00:27.239422567 +0000 UTC m=+4113.324947781" lastFinishedPulling="2025-12-10 13:00:34.185013279 +0000 UTC m=+4120.270538523" observedRunningTime="2025-12-10 13:00:35.364042517 +0000 UTC m=+4121.449567741" watchObservedRunningTime="2025-12-10 13:00:35.369550145 +0000 UTC m=+4121.455075379" Dec 10 13:00:35 crc kubenswrapper[4852]: I1210 13:00:35.661841 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:35 crc kubenswrapper[4852]: I1210 13:00:35.662008 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:37 crc kubenswrapper[4852]: I1210 13:00:37.304847 4852 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2btlg" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="registry-server" probeResult="failure" output=< Dec 10 13:00:37 crc kubenswrapper[4852]: timeout: failed to connect service ":50051" within 1s Dec 10 13:00:37 crc kubenswrapper[4852]: > Dec 10 13:00:38 crc kubenswrapper[4852]: I1210 13:00:38.170707 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:00:38 crc kubenswrapper[4852]: E1210 13:00:38.171310 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 13:00:43 crc kubenswrapper[4852]: I1210 13:00:43.685064 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:43 crc kubenswrapper[4852]: I1210 13:00:43.685842 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:43 crc kubenswrapper[4852]: I1210 13:00:43.771939 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:45 crc kubenswrapper[4852]: I1210 13:00:45.147085 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xk9cn" Dec 10 13:00:45 crc kubenswrapper[4852]: I1210 13:00:45.232347 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk9cn"] Dec 10 13:00:45 crc kubenswrapper[4852]: I1210 13:00:45.284417 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hzpk"] Dec 10 13:00:45 crc kubenswrapper[4852]: I1210 13:00:45.284971 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5hzpk" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="registry-server" containerID="cri-o://00a8ee84414d06ebc9ce391f216a8b84b14392d6af85268353c96a346f29093f" gracePeriod=2 Dec 10 13:00:45 crc kubenswrapper[4852]: I1210 13:00:45.720954 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:45 crc kubenswrapper[4852]: I1210 13:00:45.775314 4852 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:46 crc kubenswrapper[4852]: I1210 13:00:46.462607 4852 generic.go:334] "Generic (PLEG): container finished" podID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerID="00a8ee84414d06ebc9ce391f216a8b84b14392d6af85268353c96a346f29093f" exitCode=0 Dec 10 13:00:46 crc kubenswrapper[4852]: I1210 13:00:46.462733 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hzpk" event={"ID":"8ad77dbc-86d2-4bbc-8312-4529077f52a6","Type":"ContainerDied","Data":"00a8ee84414d06ebc9ce391f216a8b84b14392d6af85268353c96a346f29093f"} Dec 10 13:00:46 crc kubenswrapper[4852]: I1210 13:00:46.861142 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5hzpk" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.032589 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-utilities\") pod \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.032742 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-catalog-content\") pod \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.032950 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st7lw\" (UniqueName: \"kubernetes.io/projected/8ad77dbc-86d2-4bbc-8312-4529077f52a6-kube-api-access-st7lw\") pod \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\" (UID: \"8ad77dbc-86d2-4bbc-8312-4529077f52a6\") " Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.033642 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-utilities" (OuterVolumeSpecName: "utilities") pod "8ad77dbc-86d2-4bbc-8312-4529077f52a6" (UID: "8ad77dbc-86d2-4bbc-8312-4529077f52a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.033931 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.039715 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad77dbc-86d2-4bbc-8312-4529077f52a6-kube-api-access-st7lw" (OuterVolumeSpecName: "kube-api-access-st7lw") pod "8ad77dbc-86d2-4bbc-8312-4529077f52a6" (UID: "8ad77dbc-86d2-4bbc-8312-4529077f52a6"). InnerVolumeSpecName "kube-api-access-st7lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.107790 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ad77dbc-86d2-4bbc-8312-4529077f52a6" (UID: "8ad77dbc-86d2-4bbc-8312-4529077f52a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.136135 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st7lw\" (UniqueName: \"kubernetes.io/projected/8ad77dbc-86d2-4bbc-8312-4529077f52a6-kube-api-access-st7lw\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.136197 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad77dbc-86d2-4bbc-8312-4529077f52a6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.474795 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hzpk" event={"ID":"8ad77dbc-86d2-4bbc-8312-4529077f52a6","Type":"ContainerDied","Data":"84b991bb12ac88b8b4d8e4b407a4f4f4652cb08c304d9ff76ffdd3d40d7a06c1"} Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.475115 4852 scope.go:117] "RemoveContainer" containerID="00a8ee84414d06ebc9ce391f216a8b84b14392d6af85268353c96a346f29093f" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.475294 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5hzpk" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.518614 4852 scope.go:117] "RemoveContainer" containerID="52f2ce02c159f10ea2500d439f1daf0d6832450de3ed16ae251b11c99cf525a8" Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.526707 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hzpk"] Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.536909 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5hzpk"] Dec 10 13:00:47 crc kubenswrapper[4852]: I1210 13:00:47.542776 4852 scope.go:117] "RemoveContainer" containerID="bb87a903ac3b40e866fe731124acdef655eadf68094f8cc7cfb4868f72b7ddf1" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.001632 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2btlg"] Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.001886 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2btlg" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="registry-server" containerID="cri-o://5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159" gracePeriod=2 Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.183424 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" path="/var/lib/kubelet/pods/8ad77dbc-86d2-4bbc-8312-4529077f52a6/volumes" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.476928 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.489324 4852 generic.go:334] "Generic (PLEG): container finished" podID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerID="5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159" exitCode=0 Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.489394 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerDied","Data":"5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159"} Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.489434 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2btlg" event={"ID":"b94f86ba-a27e-4278-87e1-afbe2e73df15","Type":"ContainerDied","Data":"b9fb3c5c0d4a620bbb68b30ebdf2ff068eff05683750dbab1afbd63604521f1f"} Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.489457 4852 scope.go:117] "RemoveContainer" containerID="5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.489398 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2btlg" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.519671 4852 scope.go:117] "RemoveContainer" containerID="7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.550417 4852 scope.go:117] "RemoveContainer" containerID="cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.583756 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-utilities\") pod \"b94f86ba-a27e-4278-87e1-afbe2e73df15\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.583817 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcqlr\" (UniqueName: \"kubernetes.io/projected/b94f86ba-a27e-4278-87e1-afbe2e73df15-kube-api-access-jcqlr\") pod \"b94f86ba-a27e-4278-87e1-afbe2e73df15\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.584013 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-catalog-content\") pod \"b94f86ba-a27e-4278-87e1-afbe2e73df15\" (UID: \"b94f86ba-a27e-4278-87e1-afbe2e73df15\") " Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.584315 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-utilities" (OuterVolumeSpecName: "utilities") pod "b94f86ba-a27e-4278-87e1-afbe2e73df15" (UID: "b94f86ba-a27e-4278-87e1-afbe2e73df15"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.584645 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-utilities\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.592552 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94f86ba-a27e-4278-87e1-afbe2e73df15-kube-api-access-jcqlr" (OuterVolumeSpecName: "kube-api-access-jcqlr") pod "b94f86ba-a27e-4278-87e1-afbe2e73df15" (UID: "b94f86ba-a27e-4278-87e1-afbe2e73df15"). InnerVolumeSpecName "kube-api-access-jcqlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.602891 4852 scope.go:117] "RemoveContainer" containerID="5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159" Dec 10 13:00:48 crc kubenswrapper[4852]: E1210 13:00:48.603306 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159\": container with ID starting with 5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159 not found: ID does not exist" containerID="5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.603335 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159"} err="failed to get container status \"5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159\": rpc error: code = NotFound desc = could not find container \"5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159\": container with ID starting with 5cfa006a2dfef8ff2ea4eec2422948c1652103cff7247db26f9211b705380159 not found: ID does not exist" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.603356 4852 scope.go:117] "RemoveContainer" containerID="7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7" Dec 10 13:00:48 crc kubenswrapper[4852]: E1210 13:00:48.603676 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7\": container with ID starting with 7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7 not found: ID does not exist" containerID="7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.603727 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7"} err="failed to get container status \"7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7\": rpc error: code = NotFound desc = could not find container \"7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7\": container with ID starting with 7ba72a2df90700d0d5376b619f617901a08bb49205b0b7b1f211462db47341e7 not found: ID does not exist" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.603755 4852 scope.go:117] "RemoveContainer" containerID="cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e" Dec 10 13:00:48 crc kubenswrapper[4852]: E1210 13:00:48.606149 4852 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e\": container with ID starting with cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e not found: ID does not exist" containerID="cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.606183 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e"} err="failed to get container status \"cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e\": rpc error: code = NotFound desc = could not find container \"cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e\": container with ID starting with cf263efd761ae98a74b098eacd324777bb3240987e503b2a699c2dc954a7f62e not found: ID does not exist" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.688704 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcqlr\" (UniqueName: \"kubernetes.io/projected/b94f86ba-a27e-4278-87e1-afbe2e73df15-kube-api-access-jcqlr\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.715182 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b94f86ba-a27e-4278-87e1-afbe2e73df15" (UID: "b94f86ba-a27e-4278-87e1-afbe2e73df15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.790659 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b94f86ba-a27e-4278-87e1-afbe2e73df15-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.824557 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2btlg"] Dec 10 13:00:48 crc kubenswrapper[4852]: I1210 13:00:48.836294 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2btlg"] Dec 10 13:00:50 crc kubenswrapper[4852]: I1210 13:00:50.187335 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" path="/var/lib/kubelet/pods/b94f86ba-a27e-4278-87e1-afbe2e73df15/volumes" Dec 10 13:00:51 crc kubenswrapper[4852]: I1210 13:00:51.172366 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:00:51 crc kubenswrapper[4852]: E1210 13:00:51.175252 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.155195 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29422861-sdlqm"] Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156477 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" 
containerName="registry-server" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156502 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="registry-server" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156531 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="extract-utilities" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156539 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="extract-utilities" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156546 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="extract-content" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156552 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="extract-content" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156561 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="extract-content" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156567 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="extract-content" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156584 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="registry-server" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156590 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" containerName="registry-server" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156605 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="gather" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156610 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="gather" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156624 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="copy" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156630 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="copy" Dec 10 13:01:00 crc kubenswrapper[4852]: E1210 13:01:00.156638 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="extract-utilities" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156643 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="extract-utilities" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156816 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="copy" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156829 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94f86ba-a27e-4278-87e1-afbe2e73df15" containerName="registry-server" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156845 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad77dbc-86d2-4bbc-8312-4529077f52a6" 
containerName="registry-server" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.156862 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f75cc3-6c1f-4a3a-8c8e-5a3ede0d5d6f" containerName="gather" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.158166 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.167093 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29422861-sdlqm"] Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.247218 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btm4m\" (UniqueName: \"kubernetes.io/projected/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-kube-api-access-btm4m\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.247288 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-fernet-keys\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.247353 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-config-data\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.247415 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-combined-ca-bundle\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.350311 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btm4m\" (UniqueName: \"kubernetes.io/projected/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-kube-api-access-btm4m\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.350581 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-fernet-keys\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.350644 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-config-data\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.350689 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-combined-ca-bundle\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.359256 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-fernet-keys\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.361138 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-combined-ca-bundle\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.402178 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-config-data\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.409627 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btm4m\" (UniqueName: \"kubernetes.io/projected/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-kube-api-access-btm4m\") pod \"keystone-cron-29422861-sdlqm\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.566977 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:00 crc kubenswrapper[4852]: I1210 13:01:00.901874 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29422861-sdlqm"] Dec 10 13:01:01 crc kubenswrapper[4852]: I1210 13:01:01.642645 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29422861-sdlqm" event={"ID":"c1fe60b9-3c1d-4891-bc97-85ac60b6a494","Type":"ContainerStarted","Data":"7ba8586a8eb1321caa2976b3d83b1d29884e4d3d33f8cfafcef8a85c5ede9d50"} Dec 10 13:01:01 crc kubenswrapper[4852]: I1210 13:01:01.644485 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29422861-sdlqm" event={"ID":"c1fe60b9-3c1d-4891-bc97-85ac60b6a494","Type":"ContainerStarted","Data":"a4376dd711c20edbb2ccd9164d45adf6616ad7117ddbda1ef82f3d93afae9b95"} Dec 10 13:01:01 crc kubenswrapper[4852]: I1210 13:01:01.682743 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29422861-sdlqm" podStartSLOduration=1.682726623 podStartE2EDuration="1.682726623s" podCreationTimestamp="2025-12-10 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 13:01:01.675981553 +0000 UTC m=+4147.761506777" watchObservedRunningTime="2025-12-10 13:01:01.682726623 +0000 UTC m=+4147.768251847" Dec 10 13:01:04 crc kubenswrapper[4852]: I1210 13:01:04.681848 4852 generic.go:334] "Generic (PLEG): container finished" podID="c1fe60b9-3c1d-4891-bc97-85ac60b6a494" containerID="7ba8586a8eb1321caa2976b3d83b1d29884e4d3d33f8cfafcef8a85c5ede9d50" exitCode=0 Dec 10 13:01:04 crc kubenswrapper[4852]: I1210 13:01:04.681907 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29422861-sdlqm" event={"ID":"c1fe60b9-3c1d-4891-bc97-85ac60b6a494","Type":"ContainerDied","Data":"7ba8586a8eb1321caa2976b3d83b1d29884e4d3d33f8cfafcef8a85c5ede9d50"} Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.124488 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.172412 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:01:06 crc kubenswrapper[4852]: E1210 13:01:06.172663 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.291459 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-config-data\") pod \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.291597 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-fernet-keys\") pod \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.291631 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btm4m\" (UniqueName: \"kubernetes.io/projected/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-kube-api-access-btm4m\") pod \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.291714 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-combined-ca-bundle\") pod \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\" (UID: \"c1fe60b9-3c1d-4891-bc97-85ac60b6a494\") " Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.298858 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c1fe60b9-3c1d-4891-bc97-85ac60b6a494" (UID: "c1fe60b9-3c1d-4891-bc97-85ac60b6a494"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.305439 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-kube-api-access-btm4m" (OuterVolumeSpecName: "kube-api-access-btm4m") pod "c1fe60b9-3c1d-4891-bc97-85ac60b6a494" (UID: "c1fe60b9-3c1d-4891-bc97-85ac60b6a494"). InnerVolumeSpecName "kube-api-access-btm4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.328129 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1fe60b9-3c1d-4891-bc97-85ac60b6a494" (UID: "c1fe60b9-3c1d-4891-bc97-85ac60b6a494"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.349889 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-config-data" (OuterVolumeSpecName: "config-data") pod "c1fe60b9-3c1d-4891-bc97-85ac60b6a494" (UID: "c1fe60b9-3c1d-4891-bc97-85ac60b6a494"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.394464 4852 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-config-data\") on node \"crc\" DevicePath \"\"" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.394506 4852 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.394518 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btm4m\" (UniqueName: \"kubernetes.io/projected/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-kube-api-access-btm4m\") on node \"crc\" DevicePath \"\"" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.394532 4852 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fe60b9-3c1d-4891-bc97-85ac60b6a494-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.706350 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29422861-sdlqm" event={"ID":"c1fe60b9-3c1d-4891-bc97-85ac60b6a494","Type":"ContainerDied","Data":"a4376dd711c20edbb2ccd9164d45adf6616ad7117ddbda1ef82f3d93afae9b95"} Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.706398 4852 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4376dd711c20edbb2ccd9164d45adf6616ad7117ddbda1ef82f3d93afae9b95" Dec 10 13:01:06 crc kubenswrapper[4852]: I1210 13:01:06.706465 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29422861-sdlqm" Dec 10 13:01:19 crc kubenswrapper[4852]: I1210 13:01:19.170511 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:01:19 crc kubenswrapper[4852]: I1210 13:01:19.860806 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"820fa6e8de38173f81ad2e38f1b477307381a4e192369f3e841bf38ca54208f0"} Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.666560 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vsjbn/must-gather-nmn5w"] Dec 10 13:03:25 crc kubenswrapper[4852]: E1210 13:03:25.667597 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fe60b9-3c1d-4891-bc97-85ac60b6a494" containerName="keystone-cron" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.667614 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fe60b9-3c1d-4891-bc97-85ac60b6a494" containerName="keystone-cron" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.667870 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1fe60b9-3c1d-4891-bc97-85ac60b6a494" containerName="keystone-cron" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.669151 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.687900 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vsjbn"/"openshift-service-ca.crt" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.691243 4852 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vsjbn"/"default-dockercfg-lqc5k" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.692611 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vsjbn/must-gather-nmn5w"] Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.694005 4852 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vsjbn"/"kube-root-ca.crt" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.777853 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sppw9\" (UniqueName: \"kubernetes.io/projected/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-kube-api-access-sppw9\") pod \"must-gather-nmn5w\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") " pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.778026 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-must-gather-output\") pod \"must-gather-nmn5w\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") " pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.879888 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-must-gather-output\") pod \"must-gather-nmn5w\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") " pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.879942 
4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sppw9\" (UniqueName: \"kubernetes.io/projected/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-kube-api-access-sppw9\") pod \"must-gather-nmn5w\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") " pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.880419 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-must-gather-output\") pod \"must-gather-nmn5w\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") " pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:25 crc kubenswrapper[4852]: I1210 13:03:25.903510 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sppw9\" (UniqueName: \"kubernetes.io/projected/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-kube-api-access-sppw9\") pod \"must-gather-nmn5w\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") " pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:26 crc kubenswrapper[4852]: I1210 13:03:26.010223 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" Dec 10 13:03:26 crc kubenswrapper[4852]: I1210 13:03:26.645928 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vsjbn/must-gather-nmn5w"] Dec 10 13:03:26 crc kubenswrapper[4852]: W1210 13:03:26.649537 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab8dd8ed_b5dc_42b8_a2e5_69cd10671053.slice/crio-5cc951427b66cd8f90b6434fe90f5da8550f34fcc8ef3043ae03ece14d2c910a WatchSource:0}: Error finding container 5cc951427b66cd8f90b6434fe90f5da8550f34fcc8ef3043ae03ece14d2c910a: Status 404 returned error can't find the container with id 5cc951427b66cd8f90b6434fe90f5da8550f34fcc8ef3043ae03ece14d2c910a Dec 10 13:03:27 crc kubenswrapper[4852]: I1210 13:03:27.136559 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" event={"ID":"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053","Type":"ContainerStarted","Data":"5cc951427b66cd8f90b6434fe90f5da8550f34fcc8ef3043ae03ece14d2c910a"} Dec 10 13:03:29 crc kubenswrapper[4852]: I1210 13:03:29.164712 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" event={"ID":"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053","Type":"ContainerStarted","Data":"e15d72cd273d0d0a28cb254d5a55a05efa68cf52be151786cf81367aeff496f2"} Dec 10 13:03:30 crc kubenswrapper[4852]: I1210 13:03:30.200616 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" event={"ID":"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053","Type":"ContainerStarted","Data":"03e2920ffc888b5bfb93be60dd3dbe221f1a8e81bb308a65f149342c98350342"} Dec 10 13:03:30 crc kubenswrapper[4852]: I1210 13:03:30.231180 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" podStartSLOduration=5.23115536 podStartE2EDuration="5.23115536s" podCreationTimestamp="2025-12-10 13:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 13:03:30.220872808 +0000 UTC m=+4296.306398032" watchObservedRunningTime="2025-12-10 13:03:30.23115536 +0000 UTC m=+4296.316680604" 
Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.737556 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-xklxs"] Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.739124 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.863060 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64aca9a5-a0a2-4f8d-984b-e33a644cd691-host\") pod \"crc-debug-xklxs\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.863362 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62v9h\" (UniqueName: \"kubernetes.io/projected/64aca9a5-a0a2-4f8d-984b-e33a644cd691-kube-api-access-62v9h\") pod \"crc-debug-xklxs\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.964885 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64aca9a5-a0a2-4f8d-984b-e33a644cd691-host\") pod \"crc-debug-xklxs\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.964985 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62v9h\" (UniqueName: \"kubernetes.io/projected/64aca9a5-a0a2-4f8d-984b-e33a644cd691-kube-api-access-62v9h\") pod \"crc-debug-xklxs\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:32 crc kubenswrapper[4852]: I1210 13:03:32.965046 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64aca9a5-a0a2-4f8d-984b-e33a644cd691-host\") pod \"crc-debug-xklxs\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:33 crc kubenswrapper[4852]: I1210 13:03:33.172410 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62v9h\" (UniqueName: \"kubernetes.io/projected/64aca9a5-a0a2-4f8d-984b-e33a644cd691-kube-api-access-62v9h\") pod \"crc-debug-xklxs\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:33 crc kubenswrapper[4852]: I1210 13:03:33.359555 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:03:33 crc kubenswrapper[4852]: W1210 13:03:33.398852 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64aca9a5_a0a2_4f8d_984b_e33a644cd691.slice/crio-387483a4c9597c09e229ae878aa2b70021236bd7354905c0f7191d97b32b29a1 WatchSource:0}: Error finding container 387483a4c9597c09e229ae878aa2b70021236bd7354905c0f7191d97b32b29a1: Status 404 returned error can't find the container with id 387483a4c9597c09e229ae878aa2b70021236bd7354905c0f7191d97b32b29a1 Dec 10 13:03:34 crc kubenswrapper[4852]: I1210 13:03:34.235730 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" event={"ID":"64aca9a5-a0a2-4f8d-984b-e33a644cd691","Type":"ContainerStarted","Data":"387483a4c9597c09e229ae878aa2b70021236bd7354905c0f7191d97b32b29a1"} Dec 10 13:03:35 crc kubenswrapper[4852]: I1210 13:03:35.245635 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" event={"ID":"64aca9a5-a0a2-4f8d-984b-e33a644cd691","Type":"ContainerStarted","Data":"aa794622b715a913358df6cd5696ba0598b7c27c76e36edbd9943b2eb028b42f"} Dec 10 13:03:35 crc kubenswrapper[4852]: I1210 13:03:35.266442 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" podStartSLOduration=3.266418757 podStartE2EDuration="3.266418757s" podCreationTimestamp="2025-12-10 13:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 13:03:35.259782714 +0000 UTC m=+4301.345307948" watchObservedRunningTime="2025-12-10 13:03:35.266418757 +0000 UTC m=+4301.351943981" Dec 10 13:03:45 crc kubenswrapper[4852]: I1210 13:03:45.790548 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 13:03:45 crc kubenswrapper[4852]: I1210 13:03:45.791149 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 13:04:10 crc kubenswrapper[4852]: I1210 13:04:10.545882 4852 generic.go:334] "Generic (PLEG): container finished" podID="64aca9a5-a0a2-4f8d-984b-e33a644cd691" containerID="aa794622b715a913358df6cd5696ba0598b7c27c76e36edbd9943b2eb028b42f" exitCode=0 Dec 10 13:04:10 crc kubenswrapper[4852]: I1210 13:04:10.545975 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" event={"ID":"64aca9a5-a0a2-4f8d-984b-e33a644cd691","Type":"ContainerDied","Data":"aa794622b715a913358df6cd5696ba0598b7c27c76e36edbd9943b2eb028b42f"} Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.693685 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.729374 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-xklxs"] Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.736716 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-xklxs"] Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.793398 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64aca9a5-a0a2-4f8d-984b-e33a644cd691-host\") pod \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.793477 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62v9h\" (UniqueName: \"kubernetes.io/projected/64aca9a5-a0a2-4f8d-984b-e33a644cd691-kube-api-access-62v9h\") pod \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\" (UID: \"64aca9a5-a0a2-4f8d-984b-e33a644cd691\") " Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.793485 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64aca9a5-a0a2-4f8d-984b-e33a644cd691-host" (OuterVolumeSpecName: "host") pod "64aca9a5-a0a2-4f8d-984b-e33a644cd691" (UID: "64aca9a5-a0a2-4f8d-984b-e33a644cd691"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.794286 4852 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64aca9a5-a0a2-4f8d-984b-e33a644cd691-host\") on node \"crc\" DevicePath \"\"" Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.800600 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64aca9a5-a0a2-4f8d-984b-e33a644cd691-kube-api-access-62v9h" (OuterVolumeSpecName: "kube-api-access-62v9h") pod "64aca9a5-a0a2-4f8d-984b-e33a644cd691" (UID: "64aca9a5-a0a2-4f8d-984b-e33a644cd691"). InnerVolumeSpecName "kube-api-access-62v9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:04:11 crc kubenswrapper[4852]: I1210 13:04:11.895861 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62v9h\" (UniqueName: \"kubernetes.io/projected/64aca9a5-a0a2-4f8d-984b-e33a644cd691-kube-api-access-62v9h\") on node \"crc\" DevicePath \"\"" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.182451 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64aca9a5-a0a2-4f8d-984b-e33a644cd691" path="/var/lib/kubelet/pods/64aca9a5-a0a2-4f8d-984b-e33a644cd691/volumes" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.581378 4852 scope.go:117] "RemoveContainer" containerID="aa794622b715a913358df6cd5696ba0598b7c27c76e36edbd9943b2eb028b42f" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.581477 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-xklxs" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.941980 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-hcpcj"] Dec 10 13:04:12 crc kubenswrapper[4852]: E1210 13:04:12.942456 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64aca9a5-a0a2-4f8d-984b-e33a644cd691" containerName="container-00" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.942472 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="64aca9a5-a0a2-4f8d-984b-e33a644cd691" containerName="container-00" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.942701 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="64aca9a5-a0a2-4f8d-984b-e33a644cd691" containerName="container-00" Dec 10 13:04:12 crc kubenswrapper[4852]: I1210 13:04:12.943454 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.117588 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-host\") pod \"crc-debug-hcpcj\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.117799 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsdp4\" (UniqueName: \"kubernetes.io/projected/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-kube-api-access-hsdp4\") pod \"crc-debug-hcpcj\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.219879 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-host\") pod \"crc-debug-hcpcj\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.219977 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsdp4\" (UniqueName: \"kubernetes.io/projected/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-kube-api-access-hsdp4\") pod \"crc-debug-hcpcj\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.219995 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-host\") pod \"crc-debug-hcpcj\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.253998 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsdp4\" (UniqueName: \"kubernetes.io/projected/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-kube-api-access-hsdp4\") pod \"crc-debug-hcpcj\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.262224 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.601442 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" event={"ID":"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537","Type":"ContainerStarted","Data":"495d1d10cc259ccbf41471d4ee0609e37e284fe20185d2ac929311a1c6c58b60"} Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.601852 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" event={"ID":"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537","Type":"ContainerStarted","Data":"04a71cd7a20a067b371473f1b1a4e1ccb6d5fe29d8b9516bcdfff6911865c5c5"} Dec 10 13:04:13 crc kubenswrapper[4852]: I1210 13:04:13.616970 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" podStartSLOduration=1.616947591 podStartE2EDuration="1.616947591s" podCreationTimestamp="2025-12-10 13:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-10 13:04:13.615459775 +0000 UTC m=+4339.700985009" watchObservedRunningTime="2025-12-10 13:04:13.616947591 +0000 UTC m=+4339.702472815" Dec 10 13:04:14 crc kubenswrapper[4852]: I1210 13:04:14.613698 4852 generic.go:334] "Generic (PLEG): container finished" podID="f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" containerID="495d1d10cc259ccbf41471d4ee0609e37e284fe20185d2ac929311a1c6c58b60" exitCode=0 Dec 10 13:04:14 crc kubenswrapper[4852]: I1210 13:04:14.613745 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" event={"ID":"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537","Type":"ContainerDied","Data":"495d1d10cc259ccbf41471d4ee0609e37e284fe20185d2ac929311a1c6c58b60"} Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.721184 4852 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.762948 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-hcpcj"] Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.773819 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-hcpcj"] Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.790244 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.790306 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.882649 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsdp4\" (UniqueName: \"kubernetes.io/projected/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-kube-api-access-hsdp4\") pod \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.882832 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-host\") pod \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\" (UID: \"f3b6a8cd-f91c-44f5-a33c-ff71e1b63537\") " Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.882996 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-host" (OuterVolumeSpecName: "host") pod "f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" (UID: "f3b6a8cd-f91c-44f5-a33c-ff71e1b63537"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.883288 4852 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-host\") on node \"crc\" DevicePath \"\"" Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.888989 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-kube-api-access-hsdp4" (OuterVolumeSpecName: "kube-api-access-hsdp4") pod "f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" (UID: "f3b6a8cd-f91c-44f5-a33c-ff71e1b63537"). InnerVolumeSpecName "kube-api-access-hsdp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:04:15 crc kubenswrapper[4852]: I1210 13:04:15.985327 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsdp4\" (UniqueName: \"kubernetes.io/projected/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537-kube-api-access-hsdp4\") on node \"crc\" DevicePath \"\"" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.183468 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" path="/var/lib/kubelet/pods/f3b6a8cd-f91c-44f5-a33c-ff71e1b63537/volumes" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.632917 4852 scope.go:117] "RemoveContainer" containerID="495d1d10cc259ccbf41471d4ee0609e37e284fe20185d2ac929311a1c6c58b60" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.633000 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-hcpcj" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.960487 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-rmm2q"] Dec 10 13:04:16 crc kubenswrapper[4852]: E1210 13:04:16.961815 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" containerName="container-00" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.961896 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" containerName="container-00" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.962121 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b6a8cd-f91c-44f5-a33c-ff71e1b63537" containerName="container-00" Dec 10 13:04:16 crc kubenswrapper[4852]: I1210 13:04:16.962774 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.134531 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnt9r\" (UniqueName: \"kubernetes.io/projected/a4b658dc-e110-41e0-a9e3-415f53f7a84f-kube-api-access-fnt9r\") pod \"crc-debug-rmm2q\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.134643 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b658dc-e110-41e0-a9e3-415f53f7a84f-host\") pod \"crc-debug-rmm2q\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.236290 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnt9r\" (UniqueName: \"kubernetes.io/projected/a4b658dc-e110-41e0-a9e3-415f53f7a84f-kube-api-access-fnt9r\") pod \"crc-debug-rmm2q\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.236783 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b658dc-e110-41e0-a9e3-415f53f7a84f-host\") pod \"crc-debug-rmm2q\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.236936 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b658dc-e110-41e0-a9e3-415f53f7a84f-host\") pod \"crc-debug-rmm2q\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.257667 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnt9r\" (UniqueName: \"kubernetes.io/projected/a4b658dc-e110-41e0-a9e3-415f53f7a84f-kube-api-access-fnt9r\") pod \"crc-debug-rmm2q\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.281185 4852 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:17 crc kubenswrapper[4852]: W1210 13:04:17.323374 4852 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4b658dc_e110_41e0_a9e3_415f53f7a84f.slice/crio-566cbdce38e3ca9053959778be5dd2d5a4e18669cacf856d89a226b4d43e7613 WatchSource:0}: Error finding container 566cbdce38e3ca9053959778be5dd2d5a4e18669cacf856d89a226b4d43e7613: Status 404 returned error can't find the container with id 566cbdce38e3ca9053959778be5dd2d5a4e18669cacf856d89a226b4d43e7613 Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.654804 4852 generic.go:334] "Generic (PLEG): container finished" podID="a4b658dc-e110-41e0-a9e3-415f53f7a84f" containerID="e249ef79264d83314584637e5159ac9d4fcc2ed148e70a458138c36c5f262d8f" exitCode=0 Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.654891 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" event={"ID":"a4b658dc-e110-41e0-a9e3-415f53f7a84f","Type":"ContainerDied","Data":"e249ef79264d83314584637e5159ac9d4fcc2ed148e70a458138c36c5f262d8f"} Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.655228 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" event={"ID":"a4b658dc-e110-41e0-a9e3-415f53f7a84f","Type":"ContainerStarted","Data":"566cbdce38e3ca9053959778be5dd2d5a4e18669cacf856d89a226b4d43e7613"} Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.695611 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-rmm2q"] Dec 10 13:04:17 crc kubenswrapper[4852]: I1210 13:04:17.706815 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vsjbn/crc-debug-rmm2q"] Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.761147 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.866493 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b658dc-e110-41e0-a9e3-415f53f7a84f-host\") pod \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.866576 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnt9r\" (UniqueName: \"kubernetes.io/projected/a4b658dc-e110-41e0-a9e3-415f53f7a84f-kube-api-access-fnt9r\") pod \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\" (UID: \"a4b658dc-e110-41e0-a9e3-415f53f7a84f\") " Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.866639 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b658dc-e110-41e0-a9e3-415f53f7a84f-host" (OuterVolumeSpecName: "host") pod "a4b658dc-e110-41e0-a9e3-415f53f7a84f" (UID: "a4b658dc-e110-41e0-a9e3-415f53f7a84f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.867013 4852 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b658dc-e110-41e0-a9e3-415f53f7a84f-host\") on node \"crc\" DevicePath \"\"" Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.875376 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b658dc-e110-41e0-a9e3-415f53f7a84f-kube-api-access-fnt9r" (OuterVolumeSpecName: "kube-api-access-fnt9r") pod "a4b658dc-e110-41e0-a9e3-415f53f7a84f" (UID: "a4b658dc-e110-41e0-a9e3-415f53f7a84f"). InnerVolumeSpecName "kube-api-access-fnt9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 10 13:04:18 crc kubenswrapper[4852]: I1210 13:04:18.968399 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnt9r\" (UniqueName: \"kubernetes.io/projected/a4b658dc-e110-41e0-a9e3-415f53f7a84f-kube-api-access-fnt9r\") on node \"crc\" DevicePath \"\"" Dec 10 13:04:19 crc kubenswrapper[4852]: I1210 13:04:19.675723 4852 scope.go:117] "RemoveContainer" containerID="e249ef79264d83314584637e5159ac9d4fcc2ed148e70a458138c36c5f262d8f" Dec 10 13:04:19 crc kubenswrapper[4852]: I1210 13:04:19.675771 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/crc-debug-rmm2q" Dec 10 13:04:20 crc kubenswrapper[4852]: I1210 13:04:20.182146 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b658dc-e110-41e0-a9e3-415f53f7a84f" path="/var/lib/kubelet/pods/a4b658dc-e110-41e0-a9e3-415f53f7a84f/volumes" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.346265 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b6dbfcbc8-pkb6j_8c77ea6d-6206-4361-8b0f-e8f273666084/barbican-api/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.431598 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b6dbfcbc8-pkb6j_8c77ea6d-6206-4361-8b0f-e8f273666084/barbican-api-log/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.546415 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-ffd755b9d-ffwqf_67165a10-114d-48b8-9c9b-ce7525e7d98d/barbican-keystone-listener/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.600727 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-ffd755b9d-ffwqf_67165a10-114d-48b8-9c9b-ce7525e7d98d/barbican-keystone-listener-log/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.649442 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db94ccfb7-vvhtv_cc13355d-4438-440e-bfdf-debe0d6dae5b/barbican-worker/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.792888 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db94ccfb7-vvhtv_cc13355d-4438-440e-bfdf-debe0d6dae5b/barbican-worker-log/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.827253 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5jwcp_3ed1622b-fe84-4402-b15c-6971dde2a93f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:40 crc kubenswrapper[4852]: I1210 13:04:40.975744 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/ceilometer-central-agent/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.004089 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/ceilometer-notification-agent/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.033342 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/proxy-httpd/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.095823 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14750a4b-711e-443e-94aa-670159e43e44/sg-core/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.213223 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3ee70b55-95d5-4ea5-9626-a6482097668c/cinder-api/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.218054 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3ee70b55-95d5-4ea5-9626-a6482097668c/cinder-api-log/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.471707 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ff8bb370-489b-402e-a532-8dc299fa3aee/cinder-scheduler/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.486082 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ff8bb370-489b-402e-a532-8dc299fa3aee/probe/0.log" Dec 10 13:04:41 crc kubenswrapper[4852]: I1210 13:04:41.563582 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qbjh2_5d8bf94c-e162-497e-8f35-6171e96384a3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.244339 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7mdsf_95d0bf0c-a43a-47e9-bf7e-5bdad23e513e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.285672 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-rl9rt_4e3cbf64-e31a-4f5b-a045-8a3de2cba72b/init/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.492523 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-rl9rt_4e3cbf64-e31a-4f5b-a045-8a3de2cba72b/init/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.570262 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-rl9rt_4e3cbf64-e31a-4f5b-a045-8a3de2cba72b/dnsmasq-dns/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.572192 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wrflm_abde0ae2-8030-4b0c-8e0e-d3aeaaa1af82/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.780054 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0edbc55b-f57a-46c0-9991-33d794c74319/glance-httpd/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.802775 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0edbc55b-f57a-46c0-9991-33d794c74319/glance-log/0.log" Dec 10 
13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.981312 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f7719c76-46f2-456f-8e69-8becce7f3b9c/glance-log/0.log" Dec 10 13:04:42 crc kubenswrapper[4852]: I1210 13:04:42.988758 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f7719c76-46f2-456f-8e69-8becce7f3b9c/glance-httpd/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.168348 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-cc955f7d4-bclr7_35b770c5-bcea-4f68-8c5b-fb852f8b97a9/horizon/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.252477 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nshgs_ca5421d7-d674-4ead-b580-d8c63cdffb0c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.384655 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rh9qn_40acc70f-2b91-4e6e-af47-b525289badc8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.521756 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-cc955f7d4-bclr7_35b770c5-bcea-4f68-8c5b-fb852f8b97a9/horizon-log/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.570697 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29422861-sdlqm_c1fe60b9-3c1d-4891-bc97-85ac60b6a494/keystone-cron/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.641198 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7dd8c6757f-lbdxp_fbe4801a-4ecc-4ecd-b00d-da9917481e2e/keystone-api/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.713306 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_20797400-1dd7-4c4b-af50-9f0c839a06c6/kube-state-metrics/0.log" Dec 10 13:04:43 crc kubenswrapper[4852]: I1210 13:04:43.804238 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jnxmd_2e945b09-b0dd-4c09-9ba2-38cb4d2b3e6f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:44 crc kubenswrapper[4852]: I1210 13:04:44.145584 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9dd466c4f-pgb9f_63687f02-3cc2-4640-88f1-e312bbe550e7/neutron-httpd/0.log" Dec 10 13:04:44 crc kubenswrapper[4852]: I1210 13:04:44.171673 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9dd466c4f-pgb9f_63687f02-3cc2-4640-88f1-e312bbe550e7/neutron-api/0.log" Dec 10 13:04:44 crc kubenswrapper[4852]: I1210 13:04:44.199965 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qn2vh_9d30136b-22e2-4932-9da4-836b2368d7bc/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:44 crc kubenswrapper[4852]: I1210 13:04:44.771899 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9cebd010-0435-40cc-9d60-2359682ee83e/nova-api-log/0.log" Dec 10 13:04:44 crc kubenswrapper[4852]: I1210 13:04:44.864384 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_70c6e25c-cf76-4ec0-9981-5a8dbc98d07e/nova-cell0-conductor-conductor/0.log" Dec 10 
13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.068744 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a903af04-d97a-42ba-94c4-af5d3c84de08/nova-cell1-conductor-conductor/0.log" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.249443 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5f6d8b73-adeb-47cd-9150-613bda06874e/nova-cell1-novncproxy-novncproxy/0.log" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.277609 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9cebd010-0435-40cc-9d60-2359682ee83e/nova-api-api/0.log" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.317993 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-bjrzh_4d0aea88-1cca-4e75-bc26-15c9f44d8682/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.625520 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3/nova-metadata-log/0.log" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.789597 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.789650 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.789693 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.790402 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"820fa6e8de38173f81ad2e38f1b477307381a4e192369f3e841bf38ca54208f0"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.790449 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://820fa6e8de38173f81ad2e38f1b477307381a4e192369f3e841bf38ca54208f0" gracePeriod=600 Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.897600 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06dd4615-ecfb-4e00-9dcf-ee18317d1f95/mysql-bootstrap/0.log" Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.949518 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="820fa6e8de38173f81ad2e38f1b477307381a4e192369f3e841bf38ca54208f0" exitCode=0 Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.949573 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"820fa6e8de38173f81ad2e38f1b477307381a4e192369f3e841bf38ca54208f0"} Dec 10 13:04:45 crc kubenswrapper[4852]: I1210 13:04:45.949609 4852 scope.go:117] "RemoveContainer" containerID="8d4b4b8c0467954d7b8eb9bc4c054069abeb500950e4bd4daa8794de055dc142" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.014262 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9065a2ec-b14d-4376-87f7-2305a86dec0c/nova-scheduler-scheduler/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.086292 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06dd4615-ecfb-4e00-9dcf-ee18317d1f95/mysql-bootstrap/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.192955 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_06dd4615-ecfb-4e00-9dcf-ee18317d1f95/galera/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.339382 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d466b79-84c0-42e9-8952-8491b4ced74e/mysql-bootstrap/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.554051 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d466b79-84c0-42e9-8952-8491b4ced74e/mysql-bootstrap/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.557746 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2d466b79-84c0-42e9-8952-8491b4ced74e/galera/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.732083 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_12600c57-0ba3-4781-93cc-317e533e52d8/openstackclient/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.813077 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gkhdx_6246b317-7d73-49ff-bd8e-f4862a4584c6/ovn-controller/0.log" Dec 10 13:04:46 crc kubenswrapper[4852]: I1210 13:04:46.959149 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"} Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.013131 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nspk8_2c06b796-7229-47bc-889c-4a78ef3a186a/openstack-network-exporter/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.109451 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2fdca3d6-4ed5-4e7b-995d-c6c8e40739a3/nova-metadata-metadata/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.156552 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovsdb-server-init/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.372893 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovsdb-server-init/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.388645 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovs-vswitchd/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.425482 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd68p_b4670741-ddce-45cb-aa16-8f7c419f0c89/ovsdb-server/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.641301 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a4999290-010a-43e8-9622-04a117f98f3f/openstack-network-exporter/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.654467 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9pkxx_438ab74a-135c-480f-9335-9e2f4f81c0c2/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.674831 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a4999290-010a-43e8-9622-04a117f98f3f/ovn-northd/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.876810 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0/openstack-network-exporter/0.log" Dec 10 13:04:47 crc kubenswrapper[4852]: I1210 13:04:47.914216 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fd1e39d-cd48-4c7e-94ff-769ddad0e7f0/ovsdbserver-nb/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.035587 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b16645b8-8fa6-46cc-848a-2815e736e9b2/openstack-network-exporter/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.112014 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b16645b8-8fa6-46cc-848a-2815e736e9b2/ovsdbserver-sb/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.225353 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db947f9b4-m6rgq_1ebd8c65-e675-462a-bdba-db5d0ea01754/placement-api/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.354303 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db947f9b4-m6rgq_1ebd8c65-e675-462a-bdba-db5d0ea01754/placement-log/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.631338 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_63001b32-e957-4b24-a742-7932191e7598/setup-container/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.739009 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_63001b32-e957-4b24-a742-7932191e7598/rabbitmq/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.797773 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_63001b32-e957-4b24-a742-7932191e7598/setup-container/0.log" Dec 10 13:04:48 crc kubenswrapper[4852]: I1210 13:04:48.888119 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_280ccc25-3ba2-46ea-b167-19480cb76a48/setup-container/0.log" Dec 10 13:04:49 crc kubenswrapper[4852]: I1210 13:04:49.081458 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_280ccc25-3ba2-46ea-b167-19480cb76a48/rabbitmq/0.log" Dec 10 13:04:49 crc kubenswrapper[4852]: I1210 13:04:49.096032 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_280ccc25-3ba2-46ea-b167-19480cb76a48/setup-container/0.log" Dec 10 13:04:49 crc kubenswrapper[4852]: I1210 13:04:49.136486 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9q74r_f2b026ff-2591-4ac3-9ce3-51b0ab9b20d2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:49 crc kubenswrapper[4852]: I1210 13:04:49.337635 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pf6kc_2bdc2caf-227b-4210-bdbd-adf085cf4e27/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:49 crc kubenswrapper[4852]: I1210 13:04:49.383397 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cg99j_19836285-fe41-4d6e-8f05-b5aeac635c5c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.273556 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nlmsv_5d7d1222-768a-4615-8aaa-385740584e4e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.288014 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xxr4g_bc600c67-710c-494a-9fb0-866745c0709d/ssh-known-hosts-edpm-deployment/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.480185 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5w2l8_cce2fc32-02ab-4099-ac2f-c0eeca72f9a8/swift-ring-rebalance/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.540177 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8698bf8cd7-bmf4z_a41546b5-9dd3-4400-97ba-4bf433dc2c2c/proxy-server/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.593980 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-8698bf8cd7-bmf4z_a41546b5-9dd3-4400-97ba-4bf433dc2c2c/proxy-httpd/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.763104 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-reaper/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.787611 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-auditor/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.820046 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-replicator/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.960188 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-replicator/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.963858 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-auditor/0.log" Dec 10 13:04:50 crc kubenswrapper[4852]: I1210 13:04:50.987731 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/account-server/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.042719 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-server/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.140658 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/container-updater/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.165869 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-auditor/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.189054 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-expirer/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.290739 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-replicator/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.322410 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-server/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.855865 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/object-updater/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.871785 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/rsync/0.log" Dec 10 13:04:51 crc kubenswrapper[4852]: I1210 13:04:51.893293 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_41d04c65-c8a1-472a-bc74-6b20bec61fbc/swift-recon-cron/0.log" Dec 10 13:04:52 crc kubenswrapper[4852]: I1210 13:04:52.107454 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-k6tcb_33fb43eb-83b5-4e1a-8837-cb9a5bc3e7b2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:04:52 crc kubenswrapper[4852]: I1210 13:04:52.139506 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9238ddbd-fcdf-4612-974f-114508e02356/tempest-tests-tempest-tests-runner/0.log" Dec 10 13:04:52 crc kubenswrapper[4852]: I1210 13:04:52.298345 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ce2aca81-09ab-4dd7-b9b7-d35cef864a73/test-operator-logs-container/0.log" Dec 10 13:04:52 crc kubenswrapper[4852]: I1210 13:04:52.326818 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dlmkb_c8130005-2302-4ea1-8677-b590a256d3ec/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 10 13:05:01 crc kubenswrapper[4852]: I1210 13:05:01.855092 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7a324e51-4ea8-4cca-8cfd-6f64d13cd706/memcached/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.050140 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bx58k_88a0620c-81a0-4ad1-ae9a-13eb0d08e10f/kube-rbac-proxy/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.193798 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bx58k_88a0620c-81a0-4ad1-ae9a-13eb0d08e10f/manager/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.236250 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/util/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.436750 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/pull/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.454978 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/util/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.455205 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/pull/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.573253 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/util/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.597772 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/extract/0.log" Dec 10 13:05:19 crc kubenswrapper[4852]: I1210 13:05:19.603659 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bb3f66c3fb8a89cfd808421379279217352ca67f5946ae24217f663e8anqczr_b2331a1e-5a05-454c-8416-5c475817b166/pull/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.222720 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-tlflj_74cd0e4c-bd25-4b22-8b1f-cb3758f446fd/kube-rbac-proxy/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.234670 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-pnppk_97d20a41-52e0-47d5-86fd-0f486080ebf5/kube-rbac-proxy/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.257528 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-pnppk_97d20a41-52e0-47d5-86fd-0f486080ebf5/manager/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.384300 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-tlflj_74cd0e4c-bd25-4b22-8b1f-cb3758f446fd/manager/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.437814 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rhwzx_62e793a9-5b13-4532-90fe-d3313b3cf4d9/kube-rbac-proxy/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.519414 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rhwzx_62e793a9-5b13-4532-90fe-d3313b3cf4d9/manager/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 
13:05:20.632188 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-j8h26_391832bd-03d9-409e-93a0-b8986ed437ff/kube-rbac-proxy/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.634601 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-j8h26_391832bd-03d9-409e-93a0-b8986ed437ff/manager/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.804059 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-54gtc_f67c3362-3da1-45f6-8fc6-47e16b206173/manager/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.820220 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-54gtc_f67c3362-3da1-45f6-8fc6-47e16b206173/kube-rbac-proxy/0.log" Dec 10 13:05:20 crc kubenswrapper[4852]: I1210 13:05:20.912890 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5p988_3fc3907c-5313-44d8-90dd-155b24156a1b/kube-rbac-proxy/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.065683 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-gbgfx_67ab896e-72eb-4040-9397-2a2bcca37c7e/kube-rbac-proxy/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.138494 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-gbgfx_67ab896e-72eb-4040-9397-2a2bcca37c7e/manager/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.253415 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5p988_3fc3907c-5313-44d8-90dd-155b24156a1b/manager/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.272185 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4kkrb_2a5cb708-ca60-4763-bf61-6562a610e6dc/kube-rbac-proxy/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.384615 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4kkrb_2a5cb708-ca60-4763-bf61-6562a610e6dc/manager/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.492585 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-b2jw2_9c39ec89-c5bf-4cdd-a253-154db7bcf781/kube-rbac-proxy/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.511363 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-b2jw2_9c39ec89-c5bf-4cdd-a253-154db7bcf781/manager/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.691677 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nlcxk_f53525dc-0dc9-44c5-a947-2e303cb0ed1c/manager/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.713128 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nlcxk_f53525dc-0dc9-44c5-a947-2e303cb0ed1c/kube-rbac-proxy/0.log" Dec 10 13:05:21 crc 
kubenswrapper[4852]: I1210 13:05:21.865780 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pzw5d_79986568-4439-4f2a-9dc4-af5fb1a1d787/kube-rbac-proxy/0.log" Dec 10 13:05:21 crc kubenswrapper[4852]: I1210 13:05:21.987680 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pzw5d_79986568-4439-4f2a-9dc4-af5fb1a1d787/manager/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.160138 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p22mj_bf62d827-9a6d-4a53-9a65-b287195f3bea/manager/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.164078 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p22mj_bf62d827-9a6d-4a53-9a65-b287195f3bea/kube-rbac-proxy/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.188194 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xr9c5_2a55ad46-c35b-4429-b1da-7a361f7c45d0/kube-rbac-proxy/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.217288 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xr9c5_2a55ad46-c35b-4429-b1da-7a361f7c45d0/manager/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.340820 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f8csbt_3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2/manager/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.425961 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f8csbt_3ded9ed7-ec7c-4550-a840-3dd4d7c1c4b2/kube-rbac-proxy/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.710081 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nhqss_62488b9a-bd45-4d7e-a890-f2d585698d58/registry-server/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.895144 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5bc74577c9-ch9wf_267779dd-45a9-4ee6-985d-39fb7d7cb207/operator/0.log" Dec 10 13:05:22 crc kubenswrapper[4852]: I1210 13:05:22.937822 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-86mvp_75a5b678-ba48-4191-99e6-aeeaf32bf40e/kube-rbac-proxy/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.040122 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-86mvp_75a5b678-ba48-4191-99e6-aeeaf32bf40e/manager/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.097979 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-kc8c8_785eda15-0a5d-451d-8ec4-b35e1f8d8147/kube-rbac-proxy/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.175886 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-kc8c8_785eda15-0a5d-451d-8ec4-b35e1f8d8147/manager/0.log" 
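[Editor's note] The long run of log.go:25 entries here records the kubelet walking every container log under /var/log/pods and emitting one "Finished parsing log file" entry per file, in klog's structured form: a severity/date prefix (I1210 13:05:21.865780), the PID, the source file and line, a quoted message, then key="value" pairs. Below is a minimal reader-side sketch for pulling those paths back out of a dump like this one. It is not kubelet code; the only assumption is the message text and path= key visible in the entries themselves.

    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"regexp"
    )

    // Matches the path="..." pair of every "Finished parsing log file"
    // entry; works even when several entries share one physical line.
    var re = regexp.MustCompile(`"Finished parsing log file" path="([^"]+)"`)

    func main() {
    	seen := make(map[string]int) // log file path -> times parsed
    	sc := bufio.NewScanner(os.Stdin)
    	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines can exceed the 64 KB default
    	for sc.Scan() {
    		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
    			seen[m[1]]++
    		}
    	}
    	if err := sc.Err(); err != nil {
    		fmt.Fprintln(os.Stderr, "read:", err)
    		os.Exit(1)
    	}
    	for path, n := range seen { // map order is unspecified; sort if needed
    		fmt.Printf("%4d  %s\n", n, path)
    	}
    }

Fed this journal section on stdin (e.g. journalctl -u kubelet | go run parsepaths.go, file name hypothetical), it prints how many times each pod log file was parsed, which makes the repeated entries for the same init-container logs easy to spot.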
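[Editor's note] Interleaved with the log parsing, the machine-config-daemon liveness probe fails with connection refused on 127.0.0.1:8798 (first at 13:04:45 above, recurring from 13:07:15 below). Each failure episode follows the same sequence: prober.go logs "Probe failed", the kubelet kills the container with a 600s grace period, PLEG reports ContainerDied/ContainerStarted, and after repeated failures the pod lands in CrashLoopBackOff with the logged "back-off 5m0s". A minimal sketch of the equivalent HTTP check follows, assuming the kubelet's default 1-second probe timeout and its usual 200-399 success range; the URL and failure mode are taken from the log.

    package main

    import (
    	"fmt"
    	"net/http"
    	"os"
    	"time"
    )

    // Reproduces the check behind the "Probe failed" entries: an HTTP GET
    // against the machine-config-daemon health endpoint.
    func main() {
    	client := &http.Client{Timeout: 1 * time.Second} // default probeTimeoutSeconds (assumption)
    	resp, err := client.Get("http://127.0.0.1:8798/health")
    	if err != nil {
    		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused", as logged above
    		fmt.Fprintf(os.Stderr, "liveness probe failed: %v\n", err)
    		os.Exit(1)
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
    		fmt.Fprintf(os.Stderr, "liveness probe failed: HTTP %d\n", resp.StatusCode)
    		os.Exit(1)
    	}
    	fmt.Println("liveness probe succeeded:", resp.Status)
    }

With the default failureThreshold of 3, three consecutive failures like this produce exactly the "Container machine-config-daemon failed liveness probe, will be restarted" sequence recorded in this section.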
Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.308535 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xkcjc_524d7bc8-a871-4ff2-bc13-1a84d07bb0e9/operator/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.395910 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-q26ll_31cc1af5-d198-472a-aa62-2ce735f4453b/kube-rbac-proxy/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.446098 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-q26ll_31cc1af5-d198-472a-aa62-2ce735f4453b/manager/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.645887 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-lhzps_c7a73ae7-6060-497a-b94f-8988c2244f94/kube-rbac-proxy/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.738903 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d7c94c9c8-s6npl_bbce747f-ad24-476e-8746-f2bb89eba637/manager/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.741035 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-lhzps_c7a73ae7-6060-497a-b94f-8988c2244f94/manager/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.821824 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zhxkc_6b9c74bb-9c09-4976-be53-8b2c296f7788/kube-rbac-proxy/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.846825 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zhxkc_6b9c74bb-9c09-4976-be53-8b2c296f7788/manager/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.908223 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-57gb2_d01d86ae-5138-4298-8ec0-7aa8cdd468fe/kube-rbac-proxy/0.log" Dec 10 13:05:23 crc kubenswrapper[4852]: I1210 13:05:23.933405 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-57gb2_d01d86ae-5138-4298-8ec0-7aa8cdd468fe/manager/0.log" Dec 10 13:05:44 crc kubenswrapper[4852]: I1210 13:05:44.020171 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-n8gzr_8bc2ea7c-2f45-49ac-b683-c57d84d8e758/control-plane-machine-set-operator/0.log" Dec 10 13:05:44 crc kubenswrapper[4852]: I1210 13:05:44.206031 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gmm6c_c9c92825-dfcf-4030-8fa7-4326fc350f10/kube-rbac-proxy/0.log" Dec 10 13:05:44 crc kubenswrapper[4852]: I1210 13:05:44.220742 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gmm6c_c9c92825-dfcf-4030-8fa7-4326fc350f10/machine-api-operator/0.log" Dec 10 13:05:58 crc kubenswrapper[4852]: I1210 13:05:58.265714 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nt4zl_bb5429e3-7f2e-4632-b68b-18de65b5e060/cert-manager-controller/0.log" Dec 10 13:05:58 crc kubenswrapper[4852]: I1210 13:05:58.369400 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-z86h2_dbc78ba5-e2a8-444b-ab4f-a5cf34e3cbe4/cert-manager-cainjector/0.log" Dec 10 13:05:58 crc kubenswrapper[4852]: I1210 13:05:58.438095 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-rj8fh_78c79d4e-6293-4789-932e-2c42545750a5/cert-manager-webhook/0.log" Dec 10 13:06:12 crc kubenswrapper[4852]: I1210 13:06:12.228423 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-zzw4g_88acf534-fb28-4e05-bab0-f60364533fae/nmstate-console-plugin/0.log" Dec 10 13:06:12 crc kubenswrapper[4852]: I1210 13:06:12.398695 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pcbgh_21f5475e-7988-44d7-940f-76c59cf92f7e/nmstate-handler/0.log" Dec 10 13:06:12 crc kubenswrapper[4852]: I1210 13:06:12.412416 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-2mcc9_94e816ec-cfe3-413c-98f4-5d6f2880d16f/kube-rbac-proxy/0.log" Dec 10 13:06:12 crc kubenswrapper[4852]: I1210 13:06:12.497847 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-2mcc9_94e816ec-cfe3-413c-98f4-5d6f2880d16f/nmstate-metrics/0.log" Dec 10 13:06:12 crc kubenswrapper[4852]: I1210 13:06:12.593140 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-s8bfr_35feaa98-be47-42f8-af3b-bf8a5ef57ce4/nmstate-operator/0.log" Dec 10 13:06:12 crc kubenswrapper[4852]: I1210 13:06:12.672273 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-hx5dz_5b77c48c-a8a1-440d-8e0d-fab8d2087ede/nmstate-webhook/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.139639 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gj6ss_2335158c-cbc5-45a0-9438-a879aede67f1/kube-rbac-proxy/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.281704 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-gj6ss_2335158c-cbc5-45a0-9438-a879aede67f1/controller/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.424061 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.614204 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.617890 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.618616 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.668484 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.818716 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.827399 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.834861 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 13:06:27 crc kubenswrapper[4852]: I1210 13:06:27.864009 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.002021 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-reloader/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.022584 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-frr-files/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.026933 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/cp-metrics/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.059932 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/controller/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.214985 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/kube-rbac-proxy/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.252881 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/frr-metrics/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.255977 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/kube-rbac-proxy-frr/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.442549 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/reloader/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.464655 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-9cr6b_9c84ab71-bcb9-4237-a827-4fe3c1c2c754/frr-k8s-webhook-server/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.658171 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c67dbf94b-rhs8b_356ac40e-2e68-4d75-81ca-b1e3306e263a/manager/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.855114 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d78c58b5f-5mtdv_c92ae4cf-27c3-46d4-9be9-8398e1276f61/webhook-server/0.log" Dec 10 13:06:28 crc kubenswrapper[4852]: I1210 13:06:28.948007 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-jfz84_357f1ff0-29a8-4905-bac8-9bc8a5c03199/kube-rbac-proxy/0.log" Dec 10 13:06:29 crc kubenswrapper[4852]: I1210 13:06:29.554006 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jfz84_357f1ff0-29a8-4905-bac8-9bc8a5c03199/speaker/0.log" Dec 10 13:06:29 crc kubenswrapper[4852]: I1210 13:06:29.699892 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pddhf_17cd493c-8f5c-4567-8959-cf6ae0011e51/frr/0.log" Dec 10 13:06:42 crc kubenswrapper[4852]: I1210 13:06:42.839327 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/util/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.015181 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/pull/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.027684 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/util/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.057245 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/pull/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.213306 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/util/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.222308 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/pull/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.247889 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv4wrr_d6c5d4b5-9826-4365-944e-097108097f70/extract/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.378032 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/util/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.530912 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/util/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.548168 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/pull/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.558583 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/pull/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.713740 4852 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/pull/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.766998 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/util/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.771567 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83b6flt_fe5305bc-61f3-4176-902a-5e0c821b9ff3/extract/0.log" Dec 10 13:06:43 crc kubenswrapper[4852]: I1210 13:06:43.896597 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/extract-utilities/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.036823 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/extract-content/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.073926 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/extract-content/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.080010 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/extract-utilities/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.227637 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/extract-utilities/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.270399 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/extract-content/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.459116 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2872c_9151bb1d-ba24-436f-a64f-a40292d34e64/registry-server/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.468614 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/extract-utilities/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.587754 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/extract-utilities/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.613777 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/extract-content/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.614736 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/extract-content/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: I1210 13:06:44.783112 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/extract-utilities/0.log" Dec 10 13:06:44 crc kubenswrapper[4852]: 
I1210 13:06:44.811627 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/extract-content/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.008567 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xk9cn_69e59a92-b2e5-41c5-ba51-6d8a67b08da1/registry-server/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.016827 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-64m8g_ff1e723c-986a-4c70-8340-aee0dacc330d/marketplace-operator/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.093168 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-utilities/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.219554 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-utilities/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.246246 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-content/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.255857 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-content/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.423526 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-content/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.497680 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/extract-utilities/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.573579 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fqx2n_fc96b426-5b87-4797-91be-ae9864b34b82/registry-server/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.639924 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-utilities/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.830399 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-content/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.851686 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-content/0.log"
Dec 10 13:06:45 crc kubenswrapper[4852]: I1210 13:06:45.851823 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-utilities/0.log"
Dec 10 13:06:46 crc kubenswrapper[4852]: I1210 13:06:46.076895 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-utilities/0.log"
Dec 10 13:06:46 crc kubenswrapper[4852]: I1210 13:06:46.096114 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/extract-content/0.log"
Dec 10 13:06:46 crc kubenswrapper[4852]: I1210 13:06:46.665694 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4gn7_64eaac3d-ea36-4ea6-90dd-0b376a897f27/registry-server/0.log"
Dec 10 13:07:15 crc kubenswrapper[4852]: I1210 13:07:15.798745 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 13:07:15 crc kubenswrapper[4852]: I1210 13:07:15.799193 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 13:07:45 crc kubenswrapper[4852]: I1210 13:07:45.790670 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 13:07:45 crc kubenswrapper[4852]: I1210 13:07:45.791381 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.790263 4852 patch_prober.go:28] interesting pod/machine-config-daemon-thqgh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.790753 4852 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.790794 4852 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-thqgh"
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.791559 4852 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"} pod="openshift-machine-config-operator/machine-config-daemon-thqgh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.791608 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerName="machine-config-daemon" containerID="cri-o://c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1" gracePeriod=600
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.921749 4852 generic.go:334] "Generic (PLEG): container finished" podID="06184023-d738-4d23-ae7e-bc0dde135fa2" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1" exitCode=0
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.921831 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerDied","Data":"c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"}
Dec 10 13:08:15 crc kubenswrapper[4852]: I1210 13:08:15.922164 4852 scope.go:117] "RemoveContainer" containerID="820fa6e8de38173f81ad2e38f1b477307381a4e192369f3e841bf38ca54208f0"
Dec 10 13:08:15 crc kubenswrapper[4852]: E1210 13:08:15.924771 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:08:16 crc kubenswrapper[4852]: I1210 13:08:16.934731 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:08:16 crc kubenswrapper[4852]: E1210 13:08:16.935167 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:08:30 crc kubenswrapper[4852]: I1210 13:08:30.172943 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:08:30 crc kubenswrapper[4852]: E1210 13:08:30.174649 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:08:33 crc kubenswrapper[4852]: I1210 13:08:33.197761 4852 generic.go:334] "Generic (PLEG): container finished" podID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerID="e15d72cd273d0d0a28cb254d5a55a05efa68cf52be151786cf81367aeff496f2" exitCode=0
Dec 10 13:08:33 crc kubenswrapper[4852]: I1210 13:08:33.197808 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" event={"ID":"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053","Type":"ContainerDied","Data":"e15d72cd273d0d0a28cb254d5a55a05efa68cf52be151786cf81367aeff496f2"}
Dec 10 13:08:33 crc kubenswrapper[4852]: I1210 13:08:33.198301 4852 scope.go:117] "RemoveContainer" containerID="e15d72cd273d0d0a28cb254d5a55a05efa68cf52be151786cf81367aeff496f2"
Dec 10 13:08:34 crc kubenswrapper[4852]: I1210 13:08:34.134445 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vsjbn_must-gather-nmn5w_ab8dd8ed-b5dc-42b8-a2e5-69cd10671053/gather/0.log"
Dec 10 13:08:44 crc kubenswrapper[4852]: I1210 13:08:44.179201 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:08:44 crc kubenswrapper[4852]: E1210 13:08:44.181997 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:08:44 crc kubenswrapper[4852]: I1210 13:08:44.763076 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vsjbn/must-gather-nmn5w"]
Dec 10 13:08:44 crc kubenswrapper[4852]: I1210 13:08:44.763465 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vsjbn/must-gather-nmn5w" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="copy" containerID="cri-o://03e2920ffc888b5bfb93be60dd3dbe221f1a8e81bb308a65f149342c98350342" gracePeriod=2
Dec 10 13:08:44 crc kubenswrapper[4852]: I1210 13:08:44.781584 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vsjbn/must-gather-nmn5w"]
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.334041 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vsjbn_must-gather-nmn5w_ab8dd8ed-b5dc-42b8-a2e5-69cd10671053/copy/0.log"
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.334459 4852 generic.go:334] "Generic (PLEG): container finished" podID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerID="03e2920ffc888b5bfb93be60dd3dbe221f1a8e81bb308a65f149342c98350342" exitCode=143
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.669820 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vsjbn_must-gather-nmn5w_ab8dd8ed-b5dc-42b8-a2e5-69cd10671053/copy/0.log"
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.670362 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/must-gather-nmn5w"
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.800171 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-must-gather-output\") pod \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") "
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.800278 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sppw9\" (UniqueName: \"kubernetes.io/projected/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-kube-api-access-sppw9\") pod \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\" (UID: \"ab8dd8ed-b5dc-42b8-a2e5-69cd10671053\") "
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.808496 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-kube-api-access-sppw9" (OuterVolumeSpecName: "kube-api-access-sppw9") pod "ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" (UID: "ab8dd8ed-b5dc-42b8-a2e5-69cd10671053"). InnerVolumeSpecName "kube-api-access-sppw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.906655 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sppw9\" (UniqueName: \"kubernetes.io/projected/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-kube-api-access-sppw9\") on node \"crc\" DevicePath \"\""
Dec 10 13:08:45 crc kubenswrapper[4852]: I1210 13:08:45.988461 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" (UID: "ab8dd8ed-b5dc-42b8-a2e5-69cd10671053"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 13:08:46 crc kubenswrapper[4852]: I1210 13:08:46.008448 4852 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 10 13:08:46 crc kubenswrapper[4852]: I1210 13:08:46.181218 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" path="/var/lib/kubelet/pods/ab8dd8ed-b5dc-42b8-a2e5-69cd10671053/volumes"
Dec 10 13:08:46 crc kubenswrapper[4852]: I1210 13:08:46.343345 4852 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vsjbn_must-gather-nmn5w_ab8dd8ed-b5dc-42b8-a2e5-69cd10671053/copy/0.log"
Dec 10 13:08:46 crc kubenswrapper[4852]: I1210 13:08:46.343816 4852 scope.go:117] "RemoveContainer" containerID="03e2920ffc888b5bfb93be60dd3dbe221f1a8e81bb308a65f149342c98350342"
Dec 10 13:08:46 crc kubenswrapper[4852]: I1210 13:08:46.343935 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vsjbn/must-gather-nmn5w"
Dec 10 13:08:46 crc kubenswrapper[4852]: I1210 13:08:46.380865 4852 scope.go:117] "RemoveContainer" containerID="e15d72cd273d0d0a28cb254d5a55a05efa68cf52be151786cf81367aeff496f2"
Dec 10 13:08:59 crc kubenswrapper[4852]: I1210 13:08:59.170642 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:08:59 crc kubenswrapper[4852]: E1210 13:08:59.171850 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:09:14 crc kubenswrapper[4852]: I1210 13:09:14.186402 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:09:14 crc kubenswrapper[4852]: E1210 13:09:14.187093 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.982899 4852 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q7s6w"]
Dec 10 13:09:16 crc kubenswrapper[4852]: E1210 13:09:16.983806 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b658dc-e110-41e0-a9e3-415f53f7a84f" containerName="container-00"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.983819 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b658dc-e110-41e0-a9e3-415f53f7a84f" containerName="container-00"
Dec 10 13:09:16 crc kubenswrapper[4852]: E1210 13:09:16.983836 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="copy"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.983843 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="copy"
Dec 10 13:09:16 crc kubenswrapper[4852]: E1210 13:09:16.983862 4852 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="gather"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.983868 4852 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="gather"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.984135 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="gather"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.984156 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b658dc-e110-41e0-a9e3-415f53f7a84f" containerName="container-00"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.984169 4852 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8dd8ed-b5dc-42b8-a2e5-69cd10671053" containerName="copy"
Dec 10 13:09:16 crc kubenswrapper[4852]: I1210 13:09:16.985828 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.029441 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7s6w"]
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.132752 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rj8\" (UniqueName: \"kubernetes.io/projected/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-kube-api-access-d8rj8\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.133581 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-utilities\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.133886 4852 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-catalog-content\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.235898 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8rj8\" (UniqueName: \"kubernetes.io/projected/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-kube-api-access-d8rj8\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.235964 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-utilities\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.236027 4852 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-catalog-content\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.236555 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-catalog-content\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.236604 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-utilities\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.255865 4852 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rj8\" (UniqueName: \"kubernetes.io/projected/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-kube-api-access-d8rj8\") pod \"certified-operators-q7s6w\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") " pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.319197 4852 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:17 crc kubenswrapper[4852]: I1210 13:09:17.822291 4852 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7s6w"]
Dec 10 13:09:18 crc kubenswrapper[4852]: I1210 13:09:18.657929 4852 generic.go:334] "Generic (PLEG): container finished" podID="58f78ba9-e01c-47ff-a56a-1b8bb8986e58" containerID="f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234" exitCode=0
Dec 10 13:09:18 crc kubenswrapper[4852]: I1210 13:09:18.658272 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerDied","Data":"f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234"}
Dec 10 13:09:18 crc kubenswrapper[4852]: I1210 13:09:18.658365 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerStarted","Data":"4070a8bdaabd477619b6867b03aaeef88f66360848ba1dc16d315facbcc85d25"}
Dec 10 13:09:18 crc kubenswrapper[4852]: I1210 13:09:18.661465 4852 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 10 13:09:19 crc kubenswrapper[4852]: I1210 13:09:19.671642 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerStarted","Data":"4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8"}
Dec 10 13:09:20 crc kubenswrapper[4852]: I1210 13:09:20.683836 4852 generic.go:334] "Generic (PLEG): container finished" podID="58f78ba9-e01c-47ff-a56a-1b8bb8986e58" containerID="4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8" exitCode=0
Dec 10 13:09:20 crc kubenswrapper[4852]: I1210 13:09:20.683983 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerDied","Data":"4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8"}
Dec 10 13:09:23 crc kubenswrapper[4852]: I1210 13:09:23.729028 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerStarted","Data":"37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3"}
Dec 10 13:09:23 crc kubenswrapper[4852]: I1210 13:09:23.758482 4852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7s6w" podStartSLOduration=3.662843874 podStartE2EDuration="7.758449749s" podCreationTimestamp="2025-12-10 13:09:16 +0000 UTC" firstStartedPulling="2025-12-10 13:09:18.661193741 +0000 UTC m=+4644.746718965" lastFinishedPulling="2025-12-10 13:09:22.756799616 +0000 UTC m=+4648.842324840" observedRunningTime="2025-12-10 13:09:23.749833579 +0000 UTC m=+4649.835358803" watchObservedRunningTime="2025-12-10 13:09:23.758449749 +0000 UTC m=+4649.843974973"
Dec 10 13:09:27 crc kubenswrapper[4852]: I1210 13:09:27.170032 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:09:27 crc kubenswrapper[4852]: E1210 13:09:27.170706 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:09:27 crc kubenswrapper[4852]: I1210 13:09:27.320310 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:27 crc kubenswrapper[4852]: I1210 13:09:27.320375 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:27 crc kubenswrapper[4852]: I1210 13:09:27.370170 4852 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:27 crc kubenswrapper[4852]: I1210 13:09:27.824578 4852 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:27 crc kubenswrapper[4852]: I1210 13:09:27.873657 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7s6w"]
Dec 10 13:09:29 crc kubenswrapper[4852]: I1210 13:09:29.797396 4852 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7s6w" podUID="58f78ba9-e01c-47ff-a56a-1b8bb8986e58" containerName="registry-server" containerID="cri-o://37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3" gracePeriod=2
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.255634 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.449427 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8rj8\" (UniqueName: \"kubernetes.io/projected/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-kube-api-access-d8rj8\") pod \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") "
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.449683 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-catalog-content\") pod \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") "
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.449728 4852 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-utilities\") pod \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\" (UID: \"58f78ba9-e01c-47ff-a56a-1b8bb8986e58\") "
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.451557 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-utilities" (OuterVolumeSpecName: "utilities") pod "58f78ba9-e01c-47ff-a56a-1b8bb8986e58" (UID: "58f78ba9-e01c-47ff-a56a-1b8bb8986e58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.454487 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-kube-api-access-d8rj8" (OuterVolumeSpecName: "kube-api-access-d8rj8") pod "58f78ba9-e01c-47ff-a56a-1b8bb8986e58" (UID: "58f78ba9-e01c-47ff-a56a-1b8bb8986e58"). InnerVolumeSpecName "kube-api-access-d8rj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.552515 4852 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-utilities\") on node \"crc\" DevicePath \"\""
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.552559 4852 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8rj8\" (UniqueName: \"kubernetes.io/projected/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-kube-api-access-d8rj8\") on node \"crc\" DevicePath \"\""
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.810122 4852 generic.go:334] "Generic (PLEG): container finished" podID="58f78ba9-e01c-47ff-a56a-1b8bb8986e58" containerID="37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3" exitCode=0
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.810187 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerDied","Data":"37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3"}
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.810225 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7s6w" event={"ID":"58f78ba9-e01c-47ff-a56a-1b8bb8986e58","Type":"ContainerDied","Data":"4070a8bdaabd477619b6867b03aaeef88f66360848ba1dc16d315facbcc85d25"}
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.810258 4852 scope.go:117] "RemoveContainer" containerID="37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3"
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.811738 4852 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7s6w"
Dec 10 13:09:30 crc kubenswrapper[4852]: I1210 13:09:30.830056 4852 scope.go:117] "RemoveContainer" containerID="4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.270982 4852 scope.go:117] "RemoveContainer" containerID="f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.316875 4852 scope.go:117] "RemoveContainer" containerID="37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3"
Dec 10 13:09:31 crc kubenswrapper[4852]: E1210 13:09:31.317438 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3\": container with ID starting with 37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3 not found: ID does not exist" containerID="37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.317480 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3"} err="failed to get container status \"37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3\": rpc error: code = NotFound desc = could not find container \"37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3\": container with ID starting with 37393ff7e9f2e3ebb50bc24f60362d84b73a8b13a0d8f5a9fa145d32c0e22fb3 not found: ID does not exist"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.317508 4852 scope.go:117] "RemoveContainer" containerID="4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8"
Dec 10 13:09:31 crc kubenswrapper[4852]: E1210 13:09:31.317789 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8\": container with ID starting with 4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8 not found: ID does not exist" containerID="4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.317820 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8"} err="failed to get container status \"4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8\": rpc error: code = NotFound desc = could not find container \"4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8\": container with ID starting with 4e584c84a6f8f576a3465786616bd2e419481f2459c23af2146fdea565fbd2f8 not found: ID does not exist"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.317840 4852 scope.go:117] "RemoveContainer" containerID="f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234"
Dec 10 13:09:31 crc kubenswrapper[4852]: E1210 13:09:31.318357 4852 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234\": container with ID starting with f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234 not found: ID does not exist" containerID="f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234"
Dec 10 13:09:31 crc kubenswrapper[4852]: I1210 13:09:31.318389 4852 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234"} err="failed to get container status \"f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234\": rpc error: code = NotFound desc = could not find container \"f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234\": container with ID starting with f0a2291206fc2bcacb18e7b1c730787f975b2ae0587f32a720fbc322bb2af234 not found: ID does not exist"
Dec 10 13:09:32 crc kubenswrapper[4852]: I1210 13:09:32.520461 4852 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58f78ba9-e01c-47ff-a56a-1b8bb8986e58" (UID: "58f78ba9-e01c-47ff-a56a-1b8bb8986e58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 10 13:09:32 crc kubenswrapper[4852]: I1210 13:09:32.590649 4852 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f78ba9-e01c-47ff-a56a-1b8bb8986e58-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 10 13:09:32 crc kubenswrapper[4852]: I1210 13:09:32.648776 4852 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7s6w"]
Dec 10 13:09:32 crc kubenswrapper[4852]: I1210 13:09:32.658955 4852 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q7s6w"]
Dec 10 13:09:34 crc kubenswrapper[4852]: I1210 13:09:34.185343 4852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f78ba9-e01c-47ff-a56a-1b8bb8986e58" path="/var/lib/kubelet/pods/58f78ba9-e01c-47ff-a56a-1b8bb8986e58/volumes"
Dec 10 13:09:39 crc kubenswrapper[4852]: I1210 13:09:39.171087 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:09:39 crc kubenswrapper[4852]: E1210 13:09:39.172649 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:09:52 crc kubenswrapper[4852]: I1210 13:09:52.169800 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:09:52 crc kubenswrapper[4852]: E1210 13:09:52.170482 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:10:05 crc kubenswrapper[4852]: I1210 13:10:05.169458 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:10:05 crc kubenswrapper[4852]: E1210 13:10:05.170090 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:10:20 crc kubenswrapper[4852]: I1210 13:10:20.169929 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:10:20 crc kubenswrapper[4852]: E1210 13:10:20.171062 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:10:31 crc kubenswrapper[4852]: I1210 13:10:31.170419 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:10:31 crc kubenswrapper[4852]: E1210 13:10:31.171190 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:10:44 crc kubenswrapper[4852]: I1210 13:10:44.175919 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:10:44 crc kubenswrapper[4852]: E1210 13:10:44.176679 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:10:57 crc kubenswrapper[4852]: I1210 13:10:57.169510 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:10:57 crc kubenswrapper[4852]: E1210 13:10:57.170269 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:11:09 crc kubenswrapper[4852]: I1210 13:11:09.170319 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:11:09 crc kubenswrapper[4852]: E1210 13:11:09.171770 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:11:24 crc kubenswrapper[4852]: I1210 13:11:24.183162 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:11:24 crc kubenswrapper[4852]: E1210 13:11:24.184040 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:11:37 crc kubenswrapper[4852]: I1210 13:11:37.169415 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:11:37 crc kubenswrapper[4852]: E1210 13:11:37.170055 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:11:48 crc kubenswrapper[4852]: I1210 13:11:48.169639 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:11:48 crc kubenswrapper[4852]: E1210 13:11:48.171570 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:12:01 crc kubenswrapper[4852]: I1210 13:12:01.170184 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:12:01 crc kubenswrapper[4852]: E1210 13:12:01.170952 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:12:16 crc kubenswrapper[4852]: I1210 13:12:16.170675 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:12:16 crc kubenswrapper[4852]: E1210 13:12:16.171952 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:12:29 crc kubenswrapper[4852]: I1210 13:12:29.170206 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:12:29 crc kubenswrapper[4852]: E1210 13:12:29.171366 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:12:42 crc kubenswrapper[4852]: I1210 13:12:42.173595 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:12:42 crc kubenswrapper[4852]: E1210 13:12:42.174671 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:12:57 crc kubenswrapper[4852]: I1210 13:12:57.170225 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:12:57 crc kubenswrapper[4852]: E1210 13:12:57.171010 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:13:11 crc kubenswrapper[4852]: I1210 13:13:11.170219 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:13:11 crc kubenswrapper[4852]: E1210 13:13:11.171381 4852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-thqgh_openshift-machine-config-operator(06184023-d738-4d23-ae7e-bc0dde135fa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" podUID="06184023-d738-4d23-ae7e-bc0dde135fa2"
Dec 10 13:13:26 crc kubenswrapper[4852]: I1210 13:13:26.170008 4852 scope.go:117] "RemoveContainer" containerID="c6dcf1ed33d3e151b2f88d0c6f3963df5470363c7772e769d487078996c64eb1"
Dec 10 13:13:27 crc kubenswrapper[4852]: I1210 13:13:27.446637 4852 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-thqgh" event={"ID":"06184023-d738-4d23-ae7e-bc0dde135fa2","Type":"ContainerStarted","Data":"6d007e6eda7b017dd188439990d6e831c5ad615d08c613f17c5451568c2113ea"}